The provided notebooks serve as an interactive introduction to the Microsoft Fabric Data Science experience using randomly generated data. While they don't execute any machine learning models, they offer a hands-on opportunity to explore and familiarize yourself with the tool's data wrangling and visualization capabilities.


Microsoft Fabric is revolutionizing the way data science is done. With its comprehensive suite of tools, users can now complete end-to-end data science workflows quickly and easily. From data exploration and preparation to experimentation, modeling, and scoring, Microsoft Fabric has you covered. Plus, it allows you to serve predictive insights to BI reports, giving you the insights you need to make informed decisions. With Microsoft Fabric, data science has never been easier.

To enable everyone to get started with the notebook experience, below are some notebooks that use random/public data and generate a Spark table in your Lakehouse.

Use case descriptions for the different notebooks:

  • Clinical Trials: An example for someone working in life science/healthcare who wants to explore clinical trials data, check the different study types, and analyze missing values.
  • ESG: Environmental, Social and Governance investing is used to screen investments based on corporate policies and to encourage companies to act responsibly. With the ESG notebook, think about analyzing the carbon footprint of different companies and regions and producing a heatmap. You could also include other sources in your lakehouse.
  • Chocolate Production: The focus is on manufacturing; with this notebook, you could also include event-driven data from a production location. The example shows weekly production and sales.
  • Modern Finance: This example is aimed at a finance department: ongoing revenue and predictive forecasting. Imagine integrating SAP data or other sources and managing this at a business level.
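As a minimal sketch of what these notebooks do, the snippet below generates random weekly production data in the spirit of the Chocolate Production use case; the column and table names are illustrative assumptions, and the final step (persisting to the Lakehouse via the notebook's built-in `spark` session) is shown only in comments because it requires a running Fabric notebook.

```python
import numpy as np
import pandas as pd

# Hypothetical example: random weekly production and sales figures,
# similar in spirit to the Chocolate Production notebook.
rng = np.random.default_rng(seed=42)
weeks = pd.date_range("2023-01-02", periods=12, freq="W-MON")
df = pd.DataFrame({
    "week": weeks,
    "units_produced": rng.integers(800, 1200, size=len(weeks)),
    "units_sold": rng.integers(700, 1150, size=len(weeks)),
})

# Inside a Fabric notebook, the built-in `spark` session can persist this
# as a table in the attached Lakehouse (table name is illustrative):
# spark.createDataFrame(df).write.mode("overwrite").saveAsTable("chocolate_production")
```

Running this locally only builds the pandas frame; the commented `saveAsTable` call is what makes the data appear under your Lakehouse tables when executed in Fabric.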

Prerequisites

Prepare your system for the data science experience.

Import notebooks

To enable you to start directly with the notebook experience, I have created some notebooks that you can execute; they use random/public data and generate a Spark table in your Lakehouse.

  • Switch to the Data Science experience using the experience switcher icon at the left corner of your homepage.

  • Download the notebooks from GitHub.

  • On the Data Science experience homepage, select Import notebook and upload the notebook files.

  • Once the Notebooks are imported, select Go to workspace in the import dialog box.

Attach a lakehouse to the notebooks

To demonstrate Fabric lakehouse features, this Data Science experience requires attaching a default lakehouse to the notebooks.

  1. Select Add lakehouse in the left pane and select Existing lakehouse to open the Data hub dialog box.
  2. Select the workspace and the lakehouse you intend to use with these tutorials and select Add.
  3. Once a lakehouse is added, it's visible in the lakehouse pane in the notebook UI where tables and files stored in the lakehouse can be viewed.
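Beyond the lakehouse pane in the UI, the attached default lakehouse is also mounted into the notebook's file system, conventionally at `/lakehouse/default`, so its contents can be inspected with plain Python. A small sketch (the mount path follows Fabric's convention; the helper function is my own):

```python
import os

# Fabric mounts the attached default lakehouse at this path; Files/ and
# Tables/ appear as subfolders (path per Fabric's convention).
LAKEHOUSE_ROOT = "/lakehouse/default"

def list_lakehouse(root: str = LAKEHOUSE_ROOT):
    """Return {subfolder: entries} for Files/ and Tables/,
    or None when not running inside a Fabric notebook."""
    if not os.path.isdir(root):
        return None  # e.g. running locally, outside Fabric
    return {
        sub: sorted(os.listdir(os.path.join(root, sub)))
        for sub in ("Files", "Tables")
        if os.path.isdir(os.path.join(root, sub))
    }
```

Outside Fabric the mount does not exist, so the helper simply returns None; inside a notebook with an attached lakehouse it lists the same tables and files you see in the lakehouse pane.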


You should now have the following view (the exact names will depend on your naming and your Lakehouse).

image

Lakehouse Explorer

Now you should be able to explore the data on the left or use the Data Wrangler to get familiar with it. This feature is designed to onboard newer data scientists and to accelerate pro developers.
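Data Wrangler operates on pandas DataFrames, so a Spark table from the lakehouse is typically converted to pandas first. Below is a hedged sketch using a small hypothetical sample in the spirit of the Clinical Trials notebook; the table name in the comment is illustrative.

```python
import numpy as np
import pandas as pd

# Hypothetical sample echoing the Clinical Trials notebook: inspect
# missing values before opening the frame in Data Wrangler.
df = pd.DataFrame({
    "study_type": ["Interventional", "Observational", None, "Interventional"],
    "enrollment": [120, np.nan, 45, 300],
})

missing_counts = df.isna().sum()  # per-column missing-value counts

# In a Fabric notebook, a lakehouse table can be pulled into pandas via
# df = spark.read.table("clinical_trials").toPandas()  # name is illustrative
# and then selected in the Data Wrangler launcher for interactive exploration.
```

The quick `isna().sum()` check mirrors the missing-value analysis described for the Clinical Trials use case and tells you which columns deserve attention once the frame is open in Data Wrangler.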

Lakehouse Datasets

By browsing to your Lakehouse, you should now be able to see the dataset created by the notebook. From here, you can create a report and visualize the data.

Report

Below is the final result, showing how you can create a report (auto-created in this example) and where you can change and adapt the visuals and filters.