Star schema - data warehouse for orders

Environment:

Visual Studio, Microsoft SQL Server

ETL (Extract, Transform and Load) process

A star schema database is organized around a central fact table, which holds the measures of interest and is surrounded by dimension tables that describe the attributes of those measures.
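As a minimal sketch of this layout (the table and column names here are illustrative assumptions, not necessarily the ones defined in create_tables_orders.sql):

```sql
-- Hypothetical dimension tables: descriptive attributes, surrogate keys.
CREATE TABLE DimCustomer (
    CustomerKey  INT IDENTITY(1,1) PRIMARY KEY,  -- surrogate key
    CustomerName NVARCHAR(100),
    City         NVARCHAR(50)
);

CREATE TABLE DimProduct (
    ProductKey  INT IDENTITY(1,1) PRIMARY KEY,
    ProductName NVARCHAR(100),
    Category    NVARCHAR(50)
);

CREATE TABLE DimDate (
    DateKey  INT PRIMARY KEY,  -- e.g. 20240131
    FullDate DATE,
    [Year]   INT,
    [Month]  INT
);

-- Central fact table: numeric measures plus a foreign key to each dimension.
CREATE TABLE FactOrders (
    CustomerKey INT REFERENCES DimCustomer(CustomerKey),
    ProductKey  INT REFERENCES DimProduct(ProductKey),
    DateKey     INT REFERENCES DimDate(DateKey),
    Quantity    INT,
    Amount      DECIMAL(10,2)
);
```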

1. Extracting / transforming data (SSIS - SQL Server Integration Services 2008)

File: ETL_orders.sln

Create the database (StudentId) and the necessary tables (create_tables_orders.sql). Use staging tables (CustomerSt etc.) to store information from the data sources (salesstore.xls). Data is sent to the individual staging tables asynchronously, so that changes to any one of them (e.g. repeating a load) can be made independently without breaking the entire solution.
❗ Add an Execute SQL Task that truncates each staging table after its data has been loaded into the fact table (this also saves space; the historical data is loaded into the data warehouse later anyway).
TRUNCATE removes all rows from a table, but the table structure and its columns, constraints, indexes, and so on remain (it deletes the data inside the table, not the table itself):

truncate table TableName;

Control flow

Data Flow Task
(use Data Conversion transformations -> load into the staging tables)
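The staging tables are typically flat copies of the source columns. A sketch of CustomerSt (the column list is an assumption based on the customer dimension; the actual columns come from salesstore.xls):

```sql
-- Hypothetical staging table mirroring the spreadsheet columns.
-- Excel sources usually arrive as Unicode text, which is why the SSIS
-- Data Conversion transformation is needed to cast values to the target
-- types before they reach the warehouse tables.
CREATE TABLE CustomerSt (
    CustomerId   NVARCHAR(50),
    CustomerName NVARCHAR(100),
    City         NVARCHAR(50)
);
```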

2. Loading dimension and fact tables to data warehouse

In this step, we transfer the data from the staging tables to the data warehouse in a single process called the initial load.
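The initial load can be sketched as plain INSERT ... SELECT statements: first fill the dimensions from staging, then fill the fact table while resolving business keys to surrogate keys. All table and column names below are assumptions for illustration:

```sql
-- Load a dimension from its staging table (surrogate keys come from IDENTITY).
INSERT INTO DimCustomer (CustomerName, City)
SELECT DISTINCT CustomerName, City
FROM CustomerSt;

-- Load the fact table, looking up the surrogate key in each dimension.
INSERT INTO FactOrders (CustomerKey, ProductKey, DateKey, Quantity, Amount)
SELECT dc.CustomerKey, dp.ProductKey, dd.DateKey, os.Quantity, os.Amount
FROM OrdersSt AS os
JOIN DimCustomer AS dc ON dc.CustomerName = os.CustomerName
JOIN DimProduct  AS dp ON dp.ProductName  = os.ProductName
JOIN DimDate     AS dd ON dd.FullDate     = os.OrderDate;
```

Loading the dimensions before the fact table matters: the fact rows can only be inserted once every surrogate key they reference already exists.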

3. Building the cube (SSAS - SQL Server Analysis Services)

File: multidimensional_cube_orders.sln


Deploy the project to the Analysis Services server and process the cube.