Airflow project

Add a .env file with the following contents:

Screenshot
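The screenshot above shows the exact contents. For the official Airflow docker-compose setup, this file typically just sets the host user ID so files created in mounted volumes get the right owner; the following is a sketch assuming the documented default:

```
# Sketch of a typical root .env for the official Airflow docker-compose setup.
# AIRFLOW_UID should match your host user id (run `id -u`); 50000 is the
# documented default. Your actual file may differ -- see the screenshot.
AIRFLOW_UID=50000
```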

Also add another .env file, this time in your dags folder, containing your Snowflake user, password, and account details:

Screenshot
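The shape of that file might look like the sketch below. The variable names here are assumptions, not taken from the project — match them to whatever names your DAG actually reads. Do not commit this file to version control.

```
# Hypothetical variable names -- use whatever names your DAG actually reads.
SNOWFLAKE_USER=your_user
SNOWFLAKE_PASSWORD=your_password
SNOWFLAKE_ACCOUNT=your_account_identifier
```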

Run docker-compose up airflow-init to initialize the Airflow metadata database, then docker-compose up to start the services.

Note that I didn't include the 800 MB CSV file in the repository.

You can use demo.csv for testing; it is essentially just the first 100 rows of the original data.
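If you do have the full dataset, a file like demo.csv can be regenerated with a short script. This is a sketch: the input name data.csv is a placeholder, since the original file is not in the repository.

```python
import csv

def make_demo(src: str, dst: str, n_rows: int = 100) -> None:
    """Copy the header plus the first n_rows data rows of src into dst."""
    with open(src, newline="") as fin, open(dst, "w", newline="") as fout:
        reader = csv.reader(fin)
        writer = csv.writer(fout)
        writer.writerow(next(reader))  # copy the header row
        for i, row in enumerate(reader):
            if i >= n_rows:
                break
            writer.writerow(row)

if __name__ == "__main__":
    # "data.csv" is a placeholder for the original 800 MB file's name.
    make_demo("data.csv", "demo.csv")
```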

Accessing the Airflow webserver:

When you see this line in your shell, the webserver is up and you can access it:

Screenshot

Go to localhost:8080 and log in with username airflow and password airflow.

Snowflake

Before executing the DAG, create a warehouse named 'MY_WH', a database named 'MY_DB', and lastly a schema named 'MY_SCHEMA'.
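You can create these objects in a Snowflake worksheet, or programmatically with a few DDL statements. A minimal sketch, assuming the snowflake-connector-python package and the environment variable names suggested earlier (both are assumptions, not part of the original project):

```python
# DDL matching the object names the DAG expects.
SNOWFLAKE_SETUP = [
    "CREATE WAREHOUSE IF NOT EXISTS MY_WH",
    "CREATE DATABASE IF NOT EXISTS MY_DB",
    "CREATE SCHEMA IF NOT EXISTS MY_DB.MY_SCHEMA",
]

def run_setup(cursor) -> None:
    """Execute the setup statements with any DB-API style cursor."""
    for stmt in SNOWFLAKE_SETUP:
        cursor.execute(stmt)

# With `pip install snowflake-connector-python`, this could be run as:
#   import os
#   import snowflake.connector
#   conn = snowflake.connector.connect(
#       user=os.environ["SNOWFLAKE_USER"],        # variable names are assumptions
#       password=os.environ["SNOWFLAKE_PASSWORD"],
#       account=os.environ["SNOWFLAKE_ACCOUNT"],
#   )
#   run_setup(conn.cursor())
#   conn.close()
```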

Connections

Add the following connections:

Screenshot

Running the DAGs

Now you can run the DAG by pressing the corresponding button. (Below is the DAG for reference.)

Screenshot

Resulting Snowflake tables

Screenshot