Building an Azure Data Lake for Bike-Share Data Analytics

The goal of this project is to develop a data lake solution with Azure Databricks, using a lakehouse architecture.

In this project, you'll build a data lake solution for Divvy bikeshare.

Divvy is a bike-sharing program in Chicago, Illinois, USA that allows riders to purchase a pass at a kiosk or use a mobile application to unlock a bike at stations around the city and use it for a specified amount of time. Bikes can be returned to the same station or to a different one. The City of Chicago makes the anonymized bike trip data publicly available for projects like this one, where we can analyze the data. The dataset looks like this:

[Image: sample of the Divvy trip dataset]

Using a lakehouse architecture in Azure Databricks, we will:

  • Design a star schema based on the business outcomes listed below;
  • Import the data into Azure Databricks using Delta Lake to create a Bronze data store;
  • Create a Gold data store in Delta Lake tables;
  • Transform the data into the star schema for the Gold data store.

The business outcomes we are designing for are as follows:

1. Analyze how much time is spent per ride

  • Based on date and time factors such as day of week and time of day
  • Based on which station is the starting and/or ending station
  • Based on age of the rider at time of the ride
  • Based on whether the rider is a member or a casual rider

2. Analyze how much money is spent

  • Per month, quarter, year
  • Per member, based on the age of the rider at account start

3. Analyze how much money is spent per member

  • Based on how many rides the rider averages per month
  • Based on how many minutes the rider spends on a bike per month

Star Schema Design

The star schema constructed from the business questions above:

[Image: star schema diagram]
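
Since the diagram is a screenshot, the sketch below shows the shape of the model it depicts; all table and column names here are assumptions derived from the business outcomes, not the exact schema from the diagram.

```python
# Dimensions: dim_rider, dim_station, dim_date
# Facts:      fact_trip (time spent per ride), fact_payment (money spent)
# Hypothetical DDL for the trip fact; names and types are assumptions.
# `spark` is the SparkSession that Databricks notebooks provide.
spark.sql("""
    CREATE TABLE IF NOT EXISTS fact_trip (
        trip_id            STRING,
        rider_id           INT,     -- FK -> dim_rider
        start_station_id   STRING,  -- FK -> dim_station
        end_station_id     STRING,  -- FK -> dim_station
        date_id            INT,     -- FK -> dim_date
        duration_minutes   DOUBLE,
        rider_age_at_trip  INT
    ) USING DELTA
""")
```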

Extract Step

Step 1: Create an Azure Databricks workspace


Step 2: Create a Spark cluster in the Databricks workspace


Step 3: Upload the data to DBFS

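The files are uploaded through the workspace UI. A quick way to confirm the upload from a notebook, assuming the CSVs landed under /FileStore/divvy/ (the path is an assumption; adjust to your upload target):

```python
# List the uploaded CSVs to confirm they are in DBFS
display(dbutils.fs.ls("dbfs:/FileStore/divvy/"))
```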

Step 4: Extract the data from the CSV files stored in DBFS and write it out as Delta files

Load the data into DataFrames.

Write the data from the DataFrames to Delta Lake.

The data has been successfully written to Delta files.
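
A minimal sketch of this step for one of the files; the paths, file name, and CSV options are assumptions, and `spark` comes predefined in the Databricks notebook.

```python
# Read the raw trips CSV from DBFS into a DataFrame
df_trips = (
    spark.read
    .format("csv")
    .option("header", "true")       # assumes the export has a header row
    .option("inferSchema", "true")
    .load("dbfs:/FileStore/divvy/trips.csv")
)

# Write the raw data out as Delta files -- this becomes the Bronze layer
(
    df_trips.write
    .format("delta")
    .mode("overwrite")
    .save("dbfs:/delta/bronze/trips")
)
```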

Load Step

Create tables and load the data from the Delta files.

The tables have been created successfully.
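
The pattern here is to register a table on top of each set of Delta files so it can be queried with SQL; a sketch, reusing the assumed paths from the extract step:

```python
# Register a table over the Bronze Delta files
# (table name and location are assumptions)
spark.sql("""
    CREATE TABLE IF NOT EXISTS bronze_trips
    USING DELTA
    LOCATION 'dbfs:/delta/bronze/trips'
""")

# Sanity check: the table should now be queryable
spark.sql("SELECT COUNT(*) AS n FROM bronze_trips").show()
```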

Transform Step

Create the riders dimension.
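
A hedged sketch of what this transform might look like; the Bronze table and column names are assumptions, since the actual code lives in the notebook screenshot.

```python
# Select the rider attributes from the Bronze table into the dimension
dim_rider = spark.table("bronze_riders").select(
    "rider_id", "first_name", "last_name", "birthday",
    "account_start_date", "account_end_date", "is_member",
)
dim_rider.write.format("delta").mode("overwrite").saveAsTable("dim_rider")
```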

Create the stations dimension.
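
The station dimension follows the same pattern (column names again assumed):

```python
# The station dimension is a straight projection of the Bronze stations table
dim_station = spark.table("bronze_stations").select(
    "station_id", "name", "latitude", "longitude",
)
dim_station.write.format("delta").mode("overwrite").saveAsTable("dim_station")
```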

We follow the same process for the remaining dimension and fact tables. All tables have been created successfully.
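
For the fact tables, the transform also derives the measures the business questions call for, such as ride duration and the rider's age at the time of the ride; a sketch under the same naming assumptions:

```python
from pyspark.sql import functions as F

trips = spark.table("bronze_trips")
riders = spark.table("bronze_riders").select("rider_id", "birthday")

fact_trip = (
    trips.join(riders, "rider_id")
    # Ride duration in minutes, from the start/end timestamps
    .withColumn(
        "duration_minutes",
        (F.col("ended_at").cast("long") - F.col("started_at").cast("long")) / 60,
    )
    # Rider's age in whole years at the time of the ride
    .withColumn(
        "rider_age_at_trip",
        F.floor(F.months_between(F.col("started_at"), F.col("birthday")) / 12),
    )
    .drop("birthday")
)
fact_trip.write.format("delta").mode("overwrite").saveAsTable("fact_trip")
```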