
Azure IoT Suite Hands-on Workshop

Agenda

1. Overview of solution architecture

2. Preparation / Software needed

3. Create simulator

4. IoT Hub

5. Azure Stream Analytics

6. Azure Storage

7. Azure Event Hub

8. Azure Machine Learning

9. Azure Function

10. Power BI

11. Others

The following discussion is based on a typical IoT lambda architecture, and this tutorial serves as a hands-on familiarization with the various Azure services involved.

1. Overview of solution architecture

The diagram below shows a typical IoT solution reference architecture.

Reference architecture

Let's map each component to the corresponding Azure service.

Overview of solution architecture

We will start by building a simulator that sends random temperature and humidity readings to IoT Hub. The data then passes through Azure Stream Analytics, which allows us to store the data in Azure Table Storage and, at the same time, compute the average reading values over a 10-second window.

Another output from Azure Stream Analytics sends the data to Azure Event Hub, which acts as a data receiver. Azure Event Hub allows high-speed data ingestion with low latency. From Azure Stream Analytics, we send the computed average readings to Azure Event Hub as one message every 10 seconds. Each message Azure Event Hub receives triggers an Azure Function, which sends the readings to our Azure Machine Learning service to predict the status.

Once the machine learning module has predicted the alert, the Azure Function writes the predicted results to Azure Table Storage for use by a web app. The web app can be built in any language, such as Java or .NET; for this workshop, we use a Power BI dashboard instead. Two tiles are pinned to show the live streaming temperature and humidity data, and a textbox is pinned to indicate the machine status.

Note that this solution architecture is for reference / introductory purposes; there are many other services that a team can leverage, depending on its requirements.

2. Preparation / Software needed

We are leveraging the power of the cloud to perform most of the tasks; nevertheless, we need a code editor to create the simulator and to do some light editing of the code that will be inserted into the Azure services.

The following applications will be helpful in setting up the development environment.

  1. Visual Studio 2017
  2. Microsoft Azure Storage Explorer
  3. Device Explorer for IoT Hub devices

To complete the exercise, you will need an Azure account. You can sign up for a trial account easily here: Create Azure account.

In this tutorial, we will use C# as the main development language. The goal is to help the team understand the concepts behind the various Azure services; once that understanding is established, the team can use any supported language for development.

During installation, choose the C# / .NET components.

3. Create Simulator

First, we will create a simple simulator that generates temperature and humidity readings. Launch Visual Studio 2017 and create a new project by navigating to File -> New -> Project.... Under Visual C#, navigate to Windows Classic Desktop and choose Console App (.NET Framework). Rename the project and choose where it should be stored.

Simulator setup

3.1 Manage NuGet Package

Once you have created the project, on the top navigation bar, click Tools -> NuGet Package Manager -> Manage NuGet Packages for Solution.... Select Browse, search for Microsoft.Azure.Devices.Client, and click Install. This downloads, installs, and adds a reference to the Azure IoT device SDK NuGet package and its dependencies.

Simulator setup 2

3.2 Code

Library

Add the following using statements at the top of the Program.cs file (the default console template already includes System.Text and System.Threading.Tasks, which the code below also uses):

using Microsoft.Azure.Devices.Client;
using Newtonsoft.Json;
IoT Device Identity

Then, add the following fields to the Program class. Note that there are two placeholders, which we will fill in later once we have created the Azure IoT Hub.

static DeviceClient deviceClient;
static string iotHubUri = "{iot hub hostname}";
static string deviceKey = "{device key}";
Methods

Add the following method to the Program class.

This method generates random temperature and humidity readings, and then sends them to Azure IoT Hub using the device credentials declared above (we will fill them in once we create the IoT Hub in section 4).

private static async void SendDeviceToCloudMessagesAsync()
{
    double minTemperature = 25;
    double minHumidity = 63;
    int messageId = 1;
    Random rand = new Random(234); // change the seed number to vary the generated readings
    int messageCount = 0; // we will send a fixed number of messages; remove the counter to send indefinitely

    while (messageCount < 300)
    {
        double currentTemperature = minTemperature + rand.NextDouble() * 15;
        double currentHumidity = minHumidity + rand.NextDouble() * 20;

        var telemetryDataPoint = new
        {
            messageId = messageId++,
            deviceId = "firstSimulator",
            temperature = currentTemperature,
            humidity = currentHumidity
        };
        var messageString = JsonConvert.SerializeObject(telemetryDataPoint);
        var message = new Message(Encoding.ASCII.GetBytes(messageString));
        message.Properties.Add("temperatureAlert", (currentTemperature > 30) ? "true" : "false");

        await deviceClient.SendEventAsync(message);
        Console.WriteLine("{0} > Sending message: {1}", DateTime.Now, messageString);

        messageCount++;
        await Task.Delay(500); // send a message every 500 ms
    }
}

This method sends a new device-to-cloud message every 500 ms. Each message contains a JSON-serialized object with the device ID and randomly generated numbers simulating a temperature sensor and a humidity sensor.
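
A single serialized message, with illustrative values, looks like this:

{"messageId":1,"deviceId":"firstSimulator","temperature":29.37,"humidity":77.52}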

Main Method

Lastly, add the following lines to the Main method:

Console.WriteLine("Simulated device\n");
deviceClient = DeviceClient.Create(iotHubUri, new DeviceAuthenticationWithRegistrySymmetricKey("myFirstDevice", deviceKey), TransportType.Mqtt);

SendDeviceToCloudMessagesAsync();
Console.ReadLine();

Note that we establish the connection with Azure IoT Hub using the MQTT protocol here; other supported protocols include HTTP and AMQP.

Note that if you want to use the HTTP protocol, you should also add the Microsoft.AspNet.WebApi.Client NuGet package to the project and include the System.Net.Http.Formatting namespace.
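
As a sketch, switching protocols only changes the TransportType argument passed to DeviceClient.Create; the enum values below all come from the same SDK:

// MQTT, as used in this tutorial
deviceClient = DeviceClient.Create(iotHubUri, new DeviceAuthenticationWithRegistrySymmetricKey("myFirstDevice", deviceKey), TransportType.Mqtt);

// AMQP
deviceClient = DeviceClient.Create(iotHubUri, new DeviceAuthenticationWithRegistrySymmetricKey("myFirstDevice", deviceKey), TransportType.Amqp);

// HTTP (requires the extra NuGet package mentioned above)
deviceClient = DeviceClient.Create(iotHubUri, new DeviceAuthenticationWithRegistrySymmetricKey("myFirstDevice", deviceKey), TransportType.Http1);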

4. Azure IoT Hub

Now, let us look at device connectivity, the most important part of the whole solution. There are several ways to connect devices to Azure.

Image

Let's take a closer look at all four connection options.

1. Direct Device Connection

The device has an internet connection and can connect to Azure directly. Devices such as the Raspberry Pi, Intel Edison, or other enterprise-grade devices are capable of this.

2. Field Gateway

If the environment requires a custom field gateway, the devices connect to the field gateway directly; the gateway then processes the data and sends it on to Azure.

3. Custom Cloud Gateway

This is similar to option 1, but the telemetry goes through a custom cloud gateway instead of being sent to Azure directly.

4. Field Gateway + Custom Cloud Gateway

This is a combination of options 2 and 3; the connection between the field gateway and the custom cloud gateway is established via VPN.

Azure IoT Hub acts as the cloud gateway in the diagram above. Now, let us replace the cloud gateway with Azure IoT Hub and include the data path, together with the available connection protocols.

Image

Why Azure IoT Hub?

  1. Designed for IoT
     1. Connects up to 10 million devices
  2. Cloud-scale messaging
     1. Device-to-cloud and cloud-to-device
     2. Durable messages (at-least-once semantics)
  3. Per-device authentication
     1. Individual device identities and credentials
  4. Multi-protocol support
     1. Natively supports AMQP, HTTP, MQTT
     2. Designed for extensibility to custom protocols
  5. Service-assisted communication
     1. Secure bi-directional communication
     2. Command and control
  6. Cloud-facing telemetry ingestion
     1. Delivery receipts, expired messages
     2. Device communication errors
  7. Connection multiplexing
     1. Single device-to-cloud connection for all communications (C2D, D2C)
  8. Multi-platform
     1. Device SDKs available for multiple platforms
     2. Multi-platform service SDK

Set up an Azure IoT Hub

Creating an Azure IoT Hub is straightforward. Log in to the Azure portal, then on the left panel, click "+" and search for "iot hub"; IoT Hub will appear in the results. Click Create, and you will be prompted to enter several parameters.

image

image

Name is the name of your Azure IoT Hub. You can click Pricing to understand the differences between the plans; essentially, different plans cater for different numbers of devices and daily message limits.

Resource Group is where you group all project-related Azure services under the same folder for easier navigation. We chose Southeast Asia as the Location, which indicates we want to spin up the services in the Singapore datacenter.

This is how it looks once you have provisioned the Azure IoT Hub.

Image

Create IoT Devices

First, navigate to Shared access policies, where you can see the list of policy names and the rights associated with each policy. Click iothubowner and copy the Connection string - Primary Key.

Now, launch Device Explorer for IoT Hub Devices. Under the "Configuration" tab, paste the IoT Hub connection string that we just copied, and click "Update". Device Explorer helps us manage devices and monitor the messages between devices and Azure IoT Hub.

To create a device, go to the "Management" tab and click "Create". Enter a device ID and copy the device key. Now head back to the simulator app and replace the placeholders with this information.

Create IoT Hub Device

The code should look something like this:

static string iotHubUri = "PAworkshop.azure-devices.net";
static string deviceKey = "7cccfTV3dxbQwfUFqAdsOlkQAixm+KOPLNhNngJiJ38=";

Also, in the Main method, ensure that the device ID is the same as the one you have just created. In my case, my device ID is firstSimulator.

deviceClient = DeviceClient.Create(iotHubUri, new DeviceAuthenticationWithRegistrySymmetricKey("firstSimulator", deviceKey), TransportType.Http1);

Once you have filled in the placeholders, you can run the simulator and see the messages it generates. To check how the messages arrive at IoT Hub, head back to Device Explorer for IoT Hub Devices, choose the device you created, and click Monitor. Under "Event Hub Data", you can see the messages being sent to Azure IoT Hub, in JSON form.

Create IoT Hub
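
As an alternative to the Device Explorer UI, devices can also be registered programmatically with the Microsoft.Azure.Devices service SDK. A minimal sketch, assuming that NuGet package is installed in a separate console app and using the iothubowner connection string copied earlier:

using System.Threading.Tasks;
using Microsoft.Azure.Devices;
using Microsoft.Azure.Devices.Common.Exceptions;

static RegistryManager registryManager = RegistryManager.CreateFromConnectionString("{iot hub connection string}");

static async Task<string> AddDeviceAsync(string deviceId)
{
    Device device;
    try
    {
        // Register a new device identity in the IoT Hub registry
        device = await registryManager.AddDeviceAsync(new Device(deviceId));
    }
    catch (DeviceAlreadyExistsException)
    {
        // The device already exists; just fetch its identity
        device = await registryManager.GetDeviceAsync(deviceId);
    }
    // This is the device key that goes into the simulator's placeholder
    return device.Authentication.SymmetricKey.PrimaryKey;
}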

This concludes Azure IoT Hub. You can find more information here: Azure IoT Hub Documentation.

5. Azure Stream Analytics

With the events/telemetry being sent to Azure via Azure IoT Hub, let's now look at the second component of the architecture: event processing.

Azure Stream Analytics offers processing, analytics, and handling of massive amounts of real-time data. It can also be configured to expose settings for rules and alarms; Azure Stream Analytics then applies these rules as it processes the incoming data and flags whatever needs to be escalated for attention. To configure Stream Analytics, users write simple SQL syntax to express if-this-then-that style rules and instructions.

Azure Stream Analytics can not only handle millions of events per second, it can also correlate across multiple independent streams of data simultaneously. This high-speed event processing allows real-time detection of anomalies or escalations based on threshold breaches or alarm settings as the data is ingested. The architecture is simple and easily scalable to an enterprise-ready solution.
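
For example, an if-this-then-that style rule is just an ordinary query. The sketch below (assuming an input alias [IoTHub] and a hypothetical [AlertOutput] sink) would forward only readings that breach a temperature threshold:

SELECT
    deviceId,
    temperature,
    System.Timestamp time
INTO
    [AlertOutput]
FROM
    [IoTHub]
WHERE
    temperature > 30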

To create an Azure Stream Analytics job, go to the Azure portal. On the left-hand panel, click "+", search for "stream analytics", and click "Stream Analytics job". Click Create, fill in the name, and choose the resource group. Here, you can choose the same resource group as the Azure IoT Hub.

image

Once the Stream Analytics job is created, you will see the screen below. The three important components of Azure Stream Analytics are:

  1. Inputs
  2. Query
  3. Outputs

We will configure all 3 components to enable the streaming job.

image

5.1 Inputs

First, we configure the input of Azure Stream Analytics, using the events from Azure IoT Hub. Click the "Inputs" box, then on the top panel, click "+ Add". Give the "Input alias" a name, and for "Source Type", choose "Data Stream".

In "Import option", choose "Use IoT hub from current subscription", and choose the IoT hub that we just created. For "Shared access policy name", choose "iothubowner". Note that the "Event serialization format" should be "JSON", as the simulator sends telemetry in JSON format. Once done, click "Create".

image

In this exercise, we create a single input; however, Azure Stream Analytics supports multiple inputs, which users can configure in the same way.

5.2 Outputs

Configuring outputs follows a similar process, and users create whichever outputs the requirements call for. In this exercise, we will create an output that stores every telemetry message in Azure Table Storage and one that streams the data to Power BI for visualization. Finally, we will calculate the average sensor values over a 10-second window and send them as a new message to Azure Event Hub, to be pushed on to Azure Machine Learning for analysis.

5.2.1 Table Storage

On the output panel, click "+ Add", give the output a name, and under "Sink", choose "Table Storage". Choose "Use table storage from current subscription", and pick the storage account we created (see the next section on how to create an Azure Storage account).

Give the table a name; under "Partition key", enter "deviceId", and for "Row key", enter "messageId". Both "deviceId" and "messageId" are fields the simulator includes when sending telemetry to Azure IoT Hub; here we use them to identify which device is sending which data.

Once done, click "Create".

image

5.2.2 Power BI

Similarly, we now add Power BI as an output stream. Give the output stream a name and choose "Power BI" as the "Sink". You will then be prompted to authorize a Power BI account; log in with yours. Choose a "Group Workspace", and give both "Dataset Name" and "Table Name" names for visualization purposes.

Once done, click "Create".

image

5.2.3 Event Hub

Again, we create Azure Event Hub as the third output. Give the output a name, choose "Event hub" as the sink, and for the import option, choose "Provide event hub settings manually". For "Service bus namespace" and "Event hub name", refer to section 7 on how to get the names. Enter "RootManageSharedAccessKey" under "Event hub policy name", and paste the key, which can be obtained from Azure Event Hub. For "Partition key column", enter "deviceId".

Make sure that JSON is selected under "Event serialization format", and choose Line separated under "Format".

Once done, click "Create".

image

5.3 Query

Now, with three different output streams, we will define the queries that route the respective data. Click the "Query" box. Let's start by streaming the telemetry to Azure Table Storage.

First, we define the input stream. Queries are written in a SQL-like language.

WITH [StreamData] AS (
    SELECT *
    FROM [IoTHub]
)

We define the input stream as StreamData, with [IoTHub] as the source, matching the input name we defined previously. Now, we select messageId, deviceId, temperature, and humidity into the table. Insert the following query below the one we just entered.

SELECT
    messageId,
    deviceId,
    temperature,
    humidity
INTO
    [TelemetryTable]
FROM
    [StreamData]

For the Power BI stream we use similar code, but notice that the previous statement didn't include a timestamp, which we need in order to visualize the telemetry. Adding "System.Timestamp time" supplies that extra column.

SELECT
    messageId,
    deviceId,
    temperature,
    humidity,
    System.Timestamp time
INTO
    [PowerBIStream]
FROM
    [StreamData]

Lastly, we want to calculate the average of the telemetry values over a 10-second time frame and send them as one message to Azure Event Hub. Notice that we group the telemetry data using "TumblingWindow", one of the windowing functions in Azure Stream Analytics; see here for more information: Azure Stream Analytics Windows Functions.

SELECT
    deviceId,
    AVG(temperature) AS temperature,
    AVG(humidity) AS humidity,
    System.Timestamp time
INTO
    [TelemetrySummary]
FROM
    [StreamData]
GROUP BY
    deviceId,
    TumblingWindow(second,10)

Once done, click "save". To verify the query and test the connection, go back to Azure Straem Analytics page, and click "Start" on the top panel. Then, run the simulator app that we created earlier on. If the query is successful, we will see the telemetry in Azure Table Storage and Azure Event Hub will receive messages.

Azure Table Storage Image

Event Hub incoming messages image

Click here to find out more about Azure Stream Analytics, and check the other Query examples for common stream analytics usage patterns.

6. Azure Storage

Azure Table Storage enables new scenarios for applications that require scalable, durable, and highly available storage for their data.

The advantages of Azure Table Storage are:

  1. Massively scalable
  2. Elastic
  3. Auto-partitioning system
  4. Accessible from anywhere in the world
  5. Multi-platform support

An Azure Storage account provides four different types of services:

  1. Blob Storage
  2. Table Storage
  3. Queue Storage
  4. File Storage

Find out more about Azure Table Storage here

To create an Azure Storage account, go to the Azure portal; on the left-hand panel, click "+", search for "azure storage", and click "Storage account - blob, files, table, queue". Click Create and fill in the information. Again, you can choose the same resource group as the previous services. Note that "Replication" offers several options; choose the one that suits your requirements.

image

Now, launch Microsoft Azure Storage Explorer. On the left-hand panel, click the second icon and log in with your Azure account. Once the storage account is created, you should be able to see the list of services under it.

image
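
Beyond browsing with Storage Explorer, the Table service can be exercised from C# in a few lines. A minimal sketch using the Microsoft.WindowsAzure.Storage NuGet package (the table name and values here are hypothetical):

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

var account = CloudStorageAccount.Parse("{storage connection string}");
var table = account.CreateCloudTableClient().GetTableReference("telemetry");
table.CreateIfNotExists();

// Entities are addressed by PartitionKey + RowKey, the same scheme
// (deviceId / messageId) we chose for the Stream Analytics output.
var entity = new DynamicTableEntity("firstSimulator", "1");
entity.Properties["temperature"] = EntityProperty.GeneratePropertyForDouble(28.5);
entity.Properties["humidity"] = EntityProperty.GeneratePropertyForDouble(70.2);
table.Execute(TableOperation.InsertOrReplace(entity));

// Read the entity back by its keys
var retrieved = table.Execute(TableOperation.Retrieve<DynamicTableEntity>("firstSimulator", "1"));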

7. Azure Event Hub

Azure Event Hub is a highly scalable data streaming platform and event ingestion service capable of receiving and processing millions of events per second. Azure Event Hubs can process and store events, data, or telemetry produced by distributed software and devices. Data sent to an event hub can be transformed and stored using any real-time analytics provider or batching/storage adapters. With the ability to provide publish-subscribe capabilities with low latency and at massive scale, Azure Event Hubs serves as the "on ramp" for Big Data.

To create an Azure Event Hub, go to the Azure portal; on the left-hand panel, click "+", search for "event hub", and click "Event Hubs". Click Create and fill in the information: give it a name and, again, choose the same resource group. The name given here is the Service Bus namespace name, which is needed when integrating with Azure Stream Analytics.

image

This step creates the namespace, so now we will create an Event Hub inside it to receive the messages. Navigate to your Event Hubs service, and on the top panel, click "+ Event Hub".

image

Give the event hub a name, and configure "Message Retention" and "Archive" if needed. Once done, click "Create". With that, we have created an Event Hub.

To enable integration with the Azure Event Hub, on the left panel, click "Shared access policies", then select "RootManageSharedAccessKey". Here, you can find the connection string used to integrate Azure Stream Analytics with Azure Event Hub.

image

When integrating with Azure Stream Analytics, we set up Azure Event Hub to receive events. For sending events to other services, see the Azure Function section.
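
In this workshop, Stream Analytics does the sending for us, but for reference, publishing an event from C# looks roughly like this (a sketch using the older WindowsAzure.ServiceBus NuGet package; the connection string and hub name are the values from this section):

using System.Text;
using Microsoft.ServiceBus.Messaging;

var client = EventHubClient.CreateFromConnectionString("{event hub connection string}", "{event hub name}");
string payload = "{\"deviceId\":\"firstSimulator\",\"temperature\":29.1,\"humidity\":72.4}";
client.Send(new EventData(Encoding.UTF8.GetBytes(payload))); // one event, JSON body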

This concludes Azure Event Hub, to find out more, click here.

8. Azure Machine Learning

In this section, we will look at how to set up Azure Machine Learning and publish it as a web service.

First, navigate to Azure ML Studio and log in with your Azure account. Click here to download the training dataset, then upload it in ML Studio by clicking the "+" sign in the bottom-left corner, choosing "Dataset", and clicking "From Local File".

Now, let's understand the data before we start to train the model.

First, let's list the first 10 rows of the dataset and plot the data points in a scatter plot. We want to check how temperature and humidity correlate with the alert.

import pandas as pd

data = pd.read_csv('https://msinternalportalborfo3vz.blob.core.windows.net/fileshare/Telemetry%20Training.csv?st=2017-06-25T17%3A57%3A00Z&se=2019-06-26T17%3A57%3A00Z&sp=rwl&sv=2015-12-11&sr=c&sig=r6lPaduQM9aRz4pMjPNkDWUpLu9ZEf59ankncWTEcqs%3D')

print(data.temperature.mean())
print(data.humidity.mean())
data.head(10)

Image

From the scatter plot below, we can see that "red" / "warning" points are concentrated in the top-right corner, which indicates that when temperature and humidity are high, the alert becomes "warning".

import pandas as pd
import matplotlib.pyplot as plt

data = pd.read_csv('https://msinternalportalborfo3vz.blob.core.windows.net/fileshare/Telemetry%20Training.csv?st=2017-06-25T17%3A57%3A00Z&se=2019-06-26T17%3A57%3A00Z&sp=rwl&sv=2015-12-11&sr=c&sig=r6lPaduQM9aRz4pMjPNkDWUpLu9ZEf59ankncWTEcqs%3D')
x = data.temperature
y = data.humidity
z = data.Alert

indices = z == 1
plt.scatter(x[indices],y[indices],marker='x',color='r')
plt.scatter(x[~indices],y[~indices],marker = 'o',color='b')
plt.title('Scatter plot of data point')
plt.xlabel('temperature')
plt.ylabel('humidity')
plt.xlim([min(x)-0.5,max(x)+0.5])
plt.ylim([min(y)-0.5,max(y)+0.5])
plt.show()

image

In this exercise, we are doing two-class classification. From the scatter plot, the boundary between the classes looks linear, so one potential candidate is logistic regression. Let's try running the regression in Python.

import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from patsy import dmatrices
from sklearn.linear_model import LogisticRegression
from mpl_toolkits.mplot3d import Axes3D

data = pd.read_csv('https://msinternalportalborfo3vz.blob.core.windows.net/fileshare/Telemetry%20Training.csv?st=2017-06-25T17%3A57%3A00Z&se=2019-06-26T17%3A57%3A00Z&sp=rwl&sv=2015-12-11&sr=c&sig=r6lPaduQM9aRz4pMjPNkDWUpLu9ZEf59ankncWTEcqs%3D')
x = data.temperature
y = data.humidity
z = data.Alert

indices = z == 1
fig = plt.figure()
ax = fig.add_subplot(111, projection='3d')
ax.scatter(x[indices], y[indices], z[indices], c='r', marker='o')
ax.scatter(x[~indices], y[~indices], z[~indices], c='b', marker='^')
ax.set_xlabel('temperature')
ax.set_ylabel('humidity')
ax.set_zlabel('alert')
plt.show()

y,X = dmatrices('Alert ~ temperature + humidity',data, return_type="dataframe")
y = np.ravel(y)

model = LogisticRegression()
model = model.fit(X,y)

print(model.score(X,y))

image

The logistic regression gives a score of 0.862, so this algorithm looks fine to use. With this code as a guide, we will use Azure Machine Learning to reproduce a similar algorithm and deploy it as a web service.
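
As a quick sanity check of what the deployed web service will do, the fitted model can score a new average reading directly (the values below are hypothetical; note the leading intercept column that dmatrices added to X):

import numpy as np

new_reading = np.array([[1.0, 36.0, 78.0]])  # [intercept, temperature, humidity]
print(model.predict(new_reading))            # 1 -> warning, 0 -> safe
print(model.predict_proba(new_reading))      # [P(no alert), P(alert)]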

In the previous exercise, we uploaded the training dataset into Azure ML Studio. Now, from Azure ML Studio, in the bottom-left corner, click "+ New", choose the "Experiment" tab, and click "Blank Experiment".

8.1 Import Data

On the left-hand panel, navigate to "Saved Datasets" and drag your dataset onto the canvas. To check the dataset, right-click the small dot below the box and click "Visualize". Here you can see a quick summary of the data.

Image

8.2 Data Transformation

Here, we only require three columns: temperature, humidity, and alert. Also, the alert variable is not yet categorical, so we need to transform it.

From the left panel, under "Data Transformation", choose "Select Columns in Dataset" and drag it onto the canvas. Connect the dot below the dataset to the dot above the column selection.

Next, we transform the alert column into categorical data. Again, from "Data Transformation", choose "Edit Metadata" and drag it onto the canvas below. Then click the box and edit the settings: for "Data type" and "Fields", choose "Unchanged", and for "Categorical", select "Make categorical".

8.3 Data Partition

Once the data is transformed, let's partition it. Under "Data Transformation", choose "Partition and Sample". Choose "Sampling" for "Partition or sample mode", and enter a "Rate of sampling" of your choice.

8.4 Data Modelling

Under "Machine Learning", drag "Train Model" into the canvas. Again, from the sub-section of "Machine Learning", navigate to "Classification", drag "Two-Class Logistic Regression" into the canvas. Click on "Train Model", launch the column selector and choose "Alert".

Once done, under "Machine Learning" tab, navigate to "Score", choose "Score Model" and drag it into canvas. Once we complete the model scoring, we want to evaluate the model. This can be done by "Evaluate Model", which can be found under "Evaluate" tab.

Lastly, connect the boxes; the canvas should look something like this:

Image

Then, at the bottom of the canvas, click "RUN". Once finished, right-click the bottom dot of "Evaluate Model" and click "Visualize". Here, you can see the performance of the model.

In this case, the default threshold value is 0.5, and the accuracy is 90.3%.

Image

8.5 Publish Web Services

The final step is to set up a web service from this model. On the bottom panel, click "SET UP WEB SERVICES", and then click "RUN" again. Once done, on the left-hand panel, click the globe icon, which opens "Web Services". You will find the web service there, together with its API key.

To learn how to use the API, click "REQUEST/RESPONSE"; there is sample code showing how to integrate it into your application.

9. Azure Function

Recall from section 7, where we configured Azure Event Hub: the messages sent to it are the average temperature and humidity values over 10-second windows. Now, we want to leverage Azure Functions to send each message to the machine learning module and store the result in Azure Table Storage.

Azure Functions is a serverless compute service that enables developers to run code on demand without having to explicitly provision or manage infrastructure.

There are many ways to trigger an Azure Function, such as:

  1. Timer
  2. HTTP Trigger
  3. GitHub
  4. EventHub Trigger
  5. Blob Trigger, etc.

For this exercise, we use the EventHub Trigger: the Azure Function fires whenever there is a new message in Azure Event Hub, sends that message to the machine learning web service, and finally stores the result in Azure Table Storage.

To create an Azure Function, again in the Azure portal, click "+" and search for "function app". Choose "Function App" and click Create.

image

Give the app a name; under "Hosting Plan", choose "Consumption Plan" if usage will be low, or "App Service Plan" if it will be high. For "Storage", select the existing storage account that we created.

Now, click on the Azure Function, and on the left hand panel, you will see

  1. Functions
  2. Proxies (preview)
  3. Slots (preview)

Click the "+" button beside Functions. Then, at the bottom under "Get started on your own", click "Custom function". Change the language to "C#" and the scenario to "all", and choose "EventHubTrigger - C#". Give the function a name; under "Event Hub connection", connect to the Azure Event Hub that we created, and replace the Event Hub name with the one we created. For the policy, choose "RootManageSharedAccessKey". Once done, click "Create".

9.1 Configure Trigger and Output

Once the function is created, click "Integrate". We configured the trigger when we set up the function, so now we configure the Outputs. Click "+ New Output" and select "Azure Table Storage". Under "Storage account connection", choose the storage account that we created previously. For "Table name", give the table a name; the results will be stored there.

image

Once done, click Save. Navigate back to the code canvas by clicking the function name.

9.2 Coding

9.2.1 Add packages

First, add the following above "using System;".

#r "Newtonsoft.Json"
#r "Microsoft.WindowsAzure.Storage"

This includes the packages needed by the function.

9.2.2 Add libraries

Then, add the following libraries under "using System;".

using Newtonsoft.Json;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Net.Http.Formatting;
using System.Text;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Table;
9.2.3 Add JSON object

We will add the following classes so that the function can serialize and deserialize the JSON objects it handles. Add them below the Run method.

Event Hub Message

public class message
{
    public string deviceid { get; set; }
    public double temperature { get; set; }
    public double humidity { get; set; }
    public string time { get; set; }
}

Table Storage Schema

public class resultTable : TableEntity
{
    public string deviceId {get; set;}
    public double temperature {get; set;}
    public double humidity {get; set;}
    public string alert {get; set;}
}

Azure ML Request JSON Object

public class StringTable
{
    public string[] ColumnNames { get; set; }
    public string[,] Values { get; set; }
}

Azure ML Response JSON Object

public class Value
{
    public List<string> ColumnNames { get; set; }
    public List<string> ColumnTypes { get; set; }
    public List<List<string>> Values { get; set; }
}

public class Output1
{
    public string type { get; set; }
    public Value value { get; set; }
}

public class Results
{
    public Output1 output1 { get; set; }
}

public class MLResult
{
    public Results Results { get; set; }
}
9.2.4 Call Azure ML Function

Recall that when we created the web service for Azure Machine Learning, there was sample code showing how to use the API.

Now, we slightly modify that method and insert it before the Run method. The code is as follows:

public class AzureML
{
    public static async Task<string> callMLFunction(double temp, double hum)
    {
        string tempString = temp.ToString();
        string humString = hum.ToString();
        using (var client = new HttpClient())
            {
                var scoreRequest = new
                {

                    Inputs = new Dictionary<string, StringTable>() {
                        {
                            "input1",
                            new StringTable()
                            {
                                ColumnNames = new string[] {"PartitionKey", "RowKey", "Timestamp", "deviceid", "deviceid@type", "humidity", "humidity@type", "messageid", "messageid@type", "temperature", "temperature@type", "Alert"},
                                Values = new string[,] {  { "value", "0", "", "value", "value", humString, "value", "0", "value", tempString, "value", "" },  { "value", "0", "", "value", "value", "63", "value", "0", "value", "28", "value", "" },  }

                            }
                        },
                    },
                    GlobalParameters = new Dictionary<string, string>()
                    {
                    }
                };
                const string apiKey = "ML API Key"; // Replace this with the API key for the web service
                client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", apiKey);    
                client.BaseAddress = new Uri("https://ussouthcentral.services.azureml.net/workspaces/49560a6cf7a64ee19712b1daaad6b208/services/f2a9b48fa0374fe480515d0e62665bb3/execute?api-version=2.0&details=true");

                HttpResponseMessage response = await client.PostAsJsonAsync("", scoreRequest);

                if (response.IsSuccessStatusCode)
                {
                    string result = await response.Content.ReadAsStringAsync();
                    MLResult resultJSON = JsonConvert.DeserializeObject<MLResult>(result);
                    string alert = resultJSON.Results.output1.value.Values[0][3];
                    if(alert == "0")
                    {
                        return "Safe";
                    }
                    else
                    {
                        return "Warning";
                    }
                }
                else
                {
                    return "Error";
                }
            }
    }

}

Notice that this method returns three different responses, depending on the machine learning result: if the predicted value for "alert" is 0, it returns "Safe"; otherwise it returns "Warning". If the web service call fails, it returns "Error".

9.2.5 Run Method

Now, replace the "Run" method will the following code:

public static async Task<string> Run(string myEventHubMessage, ICollector<resultTable> outputTable, TraceWriter log)
{
    message messageJSON = JsonConvert.DeserializeObject<message>(myEventHubMessage);
    string device = messageJSON.deviceid;
    double tempML = messageJSON.temperature;
    double humML = messageJSON.humidity;
    string status = await AzureML.callMLFunction(tempML, humML);
    outputTable.Add(new resultTable()
    {
        PartitionKey = "Functions",
        RowKey = Guid.NewGuid().ToString(),
        deviceId = device,
        temperature = tempML,
        humidity = humML,
        alert = status
    });
    log.Info($"C# Event Hub trigger function processed a message: {status}");
    return "done";
}

Essentially, the "Run" method will deserialize incoming message, and pass the temperature, humidity value into ML service to predict the alert. Once done, the result will then be stored in Azure Table Storage.

9.3 Testing

Once everything is set up, run the simulator and check the Azure Function log. You should see the predicted status, as shown below.

Azure Function Log Image

Azure Table Storage Image

With the results stored in Azure Table Storage, developers can build business applications using it as the database.
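
For example, a minimal C# sketch that reads the predictions back (reusing the resultTable class from section 9.2.3; the table name "prediction" stands in for whatever you entered in section 9.1):

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

var account = CloudStorageAccount.Parse("{storage connection string}");
var table = account.CreateCloudTableClient().GetTableReference("prediction");

// The Run method writes every result under the "Functions" partition key
var query = new TableQuery<resultTable>().Where(
    TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "Functions"));

foreach (var row in table.ExecuteQuery(query))
{
    Console.WriteLine($"{row.deviceId}: {row.temperature:F1} / {row.humidity:F1} -> {row.alert}");
}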

For more info, check out the link below:

  1. Azure Function
  2. Azure Function Rest API

10. Power BI

The final part of this solution is to visualize the streaming data in Power BI.

Recall that when we set up Azure Stream Analytics, we defined a "Group Workspace" and a "Dataset Name". If we log in to Power BI, we should see "Workspaces" on the left-hand panel; select the workspace defined in Azure Stream Analytics.

You will see 4 tabs on top:

  1. Dashboards
  2. Reports
  3. Workbooks
  4. Datasets

Now, navigate to "Datasets", and you can see the streaming dataset that we defined.

Image

Now we want to create a dashboard view with streaming data. Go to "Dashboards", and in the top-right corner, click "+ Create" and choose "Dashboard". Give it a name, and launch the dashboard once you have created it.

On the top panel, you will see "+ Add tile". Click it and choose "REAL-TIME DATA CUSTOM STREAMING DATA". Click Next and you should see the dataset that we defined in Azure Stream Analytics; select it and click Next.

On the "Visualization Type", choose "Line chart". Then you will need to define the graph. Choose time for "Axis", and for "Values", choose temperature. Select the Time Windows that you want.

Click Next and complete the remaining details, such as "Title" and "Subtitle". Once done, click Apply, and the dashboard is ready.

Try running the simulator and observe the line chart!

image

11. Others

This exercise serves as an introduction to the Azure IoT offering. The architecture used here is for reference; depending on your needs, you can customize the solution.

Additional Information

  1. Microsoft Azure IoT Reference Architecture
  2. Microsoft Azure IoT Security
  3. GitHub Repository