Azure-Data-Warehouse-For-Bike-Share-Data-Analytics

Building an Azure Data Warehouse for Bike Share Data Analytics - using Azure Synapse Analytics


Divvy is a bike sharing program in Chicago, Illinois, USA that allows riders to purchase a pass at a kiosk or use a mobile application to unlock a bike at stations around the city and use the bike for a specified amount of time. The bikes can be returned to the same station or to another station. The City of Chicago makes the anonymized bike trip data publicly available for projects like this, where we can analyze the data. The dataset consists of four tables: rider, payment, station, and trip.


The goal of this project is to develop a data warehouse solution using Azure Synapse Analytics. We will:

  • Design a star schema based on the business outcomes listed below;
  • Import the data into Synapse;
  • Transform the data into the star schema;
  • and finally, view reports from the analytics queries.

The business outcomes we are designing for are as follows:

  1. Analyze how much time is spent per ride
  • Based on date and time factors such as day of week and time of day
  • Based on which station is the starting and / or ending station
  • Based on age of the rider at time of the ride
  • Based on whether the rider is a member or a casual rider
  2. Analyze how much money is spent
  • Per month, quarter, year
  • Per member, based on the age of the rider at account start
  3. Analyze how much money is spent per member
  • Based on how many rides the rider averages per month
  • Based on how many minutes the rider spends on a bike per month

STEPS FOR THE PROJECT

Task 1: Create Azure resources

  • Create an Azure Synapse workspace
  • Create a Dedicated SQL Pool and database within the Synapse workspace

Task 2: Design a star schema

Based on the given set of business requirements, the following star schema was designed.
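As a rough illustration only, a schema serving these outcomes could center on trip and payment fact tables surrounded by rider, station, and date dimensions. The table and column names in the sketch below are assumptions for illustration, not necessarily the exact design:

```sql
-- Illustrative sketch only: table and column names are assumptions, not the exact design.
CREATE TABLE dbo.dim_rider
(
    rider_id             INT          NOT NULL,
    first_name           VARCHAR(50),
    last_name            VARCHAR(50),
    birthday             DATE,
    account_start        DATE,
    account_end          DATE,
    is_member            BIT,
    age_at_account_start INT
);

CREATE TABLE dbo.dim_station
(
    station_id   VARCHAR(50)  NOT NULL,
    station_name VARCHAR(100),
    latitude     FLOAT,
    longitude    FLOAT
);

CREATE TABLE dbo.dim_date
(
    date_id     INT  NOT NULL,   -- e.g. 20210615
    full_date   DATE,
    day_of_week INT,
    month_num   INT,
    quarter_num INT,
    year_num    INT
);

CREATE TABLE dbo.fact_trip
(
    trip_id            VARCHAR(50) NOT NULL,
    rider_id           INT,
    start_station_id   VARCHAR(50),
    end_station_id     VARCHAR(50),
    started_at_date_id INT,
    duration_minutes   INT,
    rider_age_at_trip  INT
);

CREATE TABLE dbo.fact_payment
(
    payment_id INT NOT NULL,
    rider_id   INT,
    date_id    INT,
    amount     DECIMAL(10, 2)
);
```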

Task 3: Create the data in PostgreSQL

To prepare the environment for this project, we must first create the data in PostgreSQL. This simulates the production environment, where the data is used in an OLTP system. This can be done with the Python script ProjectDataToPostgres.py.

We can verify that the data exists by using pgAdmin.
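For example, a quick row-count check against the OLTP tables (assuming the script creates the tables rider, payment, station, and trip) might look like this:

```sql
-- Row-count check in pgAdmin; assumes the script created the tables rider, payment, station, and trip.
SELECT 'rider'   AS table_name, COUNT(*) AS row_count FROM rider
UNION ALL
SELECT 'payment', COUNT(*) FROM payment
UNION ALL
SELECT 'station', COUNT(*) FROM station
UNION ALL
SELECT 'trip',    COUNT(*) FROM trip;
```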


Task 4: EXTRACT the data from PostgreSQL

In our Azure Synapse workspace, we will use the ingest wizard to create a one-time pipeline that ingests the data from PostgreSQL into Azure Blob Storage. This will result in all four tables being represented as text files in Blob Storage, ready for loading into the data warehouse.

4.1 Create a new linked service in Azure Synapse Studio


4.2 Create a new linked service for Azure Blob Storage


4.3 Ingest Data into Blob Storage

First, we use the Copy Data tool.

The source is our PostgreSQL data, and we select the required tables.

The destination for the data is our Azure Blob Storage account.

The pipeline is created successfully.

4.4 Data successfully loaded into the container in our storage account


Task 5: LOAD the data into external tables in the data warehouse

Once the data is in Blob Storage, the files appear under the data lake node in the Synapse workspace. From here, we can use the script-generating function to load the data from Blob Storage into external staging tables in the data warehouse we created with the Dedicated SQL Pool.

5.1 Create external tables

We create four external staging tables over the files in Blob Storage:

  • staging_payment
  • staging_rider
  • staging_station
  • staging_trip

All four external tables are then available in the SQL database in our workspace.
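A minimal sketch of what one of these external table scripts might look like is shown below; the data source name, container URL, file name, and column definitions are placeholder assumptions and would need to match the actual storage account and files:

```sql
-- Sketch only: data source name, location URL, file name, and columns are placeholders.
-- A database scoped credential may also be required for non-public storage.
CREATE EXTERNAL DATA SOURCE BlobStorage
WITH (
    TYPE     = HADOOP,
    LOCATION = 'wasbs://divvy@mystorageaccount.blob.core.windows.net'
);

CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (
    FORMAT_TYPE    = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',', STRING_DELIMITER = '"', FIRST_ROW = 2)
);

CREATE EXTERNAL TABLE dbo.staging_rider
(
    rider_id      INT,
    first_name    VARCHAR(50),
    last_name     VARCHAR(50),
    address       VARCHAR(100),
    birthday      DATE,
    account_start DATE,
    account_end   DATE,
    is_member     BIT
)
WITH (
    LOCATION    = 'rider.csv',
    DATA_SOURCE = BlobStorage,
    FILE_FORMAT = CsvFormat
);
```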

Task 6: TRANSFORM the data to the star schema

We will write SQL scripts to transform the data from the staging tables into the final star schema we designed. The scripts for the fact and dimension tables can be found in the Transform_star_schema folder.

Create the station dimension table

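The actual script lives in the Transform_star_schema folder; as a sketch only, a CTAS statement of this kind (with assumed staging column names) could look like:

```sql
-- Sketch only: staging column names are assumptions.
IF OBJECT_ID('dbo.dim_station') IS NOT NULL
    DROP TABLE dbo.dim_station;

CREATE TABLE dbo.dim_station
WITH (DISTRIBUTION = REPLICATE, CLUSTERED COLUMNSTORE INDEX)
AS
SELECT
    station_id,
    name AS station_name,
    latitude,
    longitude
FROM dbo.staging_station;
```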

Create the rider dimension table

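Similarly, a sketch of the rider dimension transformation, again with assumed column names, might be:

```sql
-- Sketch only: staging column names are assumptions.
IF OBJECT_ID('dbo.dim_rider') IS NOT NULL
    DROP TABLE dbo.dim_rider;

CREATE TABLE dbo.dim_rider
WITH (DISTRIBUTION = REPLICATE, CLUSTERED COLUMNSTORE INDEX)
AS
SELECT
    rider_id,
    first_name,
    last_name,
    birthday,
    account_start,
    account_end,
    is_member,
    DATEDIFF(YEAR, birthday, account_start) AS age_at_account_start
FROM dbo.staging_rider;
```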

We now have all the dimension and fact tables in our SQL database, which we can use to answer the business questions.

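For example, the first business outcome (ride duration by day of week) could be answered with a query along these lines, using the illustrative table and column names from the sketches above:

```sql
-- Average ride duration by day of week, using the illustrative names from the sketches above.
SELECT
    d.day_of_week,
    AVG(CAST(f.duration_minutes AS FLOAT)) AS avg_ride_minutes
FROM dbo.fact_trip AS f
JOIN dbo.dim_date  AS d
    ON f.started_at_date_id = d.date_id
GROUP BY d.day_of_week
ORDER BY d.day_of_week;
```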