Create ‘Connected Service’ cases in Dynamics 365 using an IoT device and Microsoft Azure

“Connected Service” is the name we give to the concept of a true end-to-end connection between an IoT-enabled device (such as a modern air conditioning unit or a lift) and the customer service department that would despatch an engineer to repair the device should it break down.

The idea is that, because the IoT device is both connected to the internet and equipped with sensors that monitor its temperature, vibration and so on, it is able to send ‘health’ messages to its service department, which can analyse this health data and decide whether to repair or swap out any given component.  This predictive maintenance model enables continuous, uninterrupted operations, without the downtime that results from the more traditional, reactive break-fix service model.

This blog post explains how to set up a demo of this concept using a Raspberry Pi, Azure services and Dynamics 365.

Expected Outcome

Once successfully set up, you’ll have a Raspberry Pi with a SenseHat that will trigger a service case to be created in Dynamics 365 whenever the humidity sensor reports a reading greater than 45%.


Connecting the dots…

This is the chain of events that will be triggered by breathing on the Pi’s SenseHat:

  1. Every second, the Pi SenseHat captures temperature, humidity and pressure data and sends this to an Azure IoT Hub
  2. The Azure IoT Hub authenticates the Pi device and ingests the data, making the data available for the next stage…
  3. …which is an Azure Stream Analytics job.  This job is a simple SQL query that says “if humidity is greater than 45, pass the data to an Azure Service Bus”
  4. The Azure Service Bus queues the data in a service bus message, ready to be picked up by…
  5. …an Azure Logic App that integrates with Dynamics 365 and creates a new Dynamics 365 Device Alert record that contains all the data sent from the Pi
  6. Once the Device Alert record has been created, Dynamics 365’s workflow kicks in to create the service case
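
For reference, a single telemetry message from the Pi ends up as a small JSON document.  The sketch below is illustrative only: the field names follow the references used later in the Logic App, while the values and the GUID are made up.

{
   "temperature": 24.3,
   "humidity": 47.2,
   "pressure": 1013.1,
   "location": "Server Room",
   "deviceName": "RaspberryPi-SenseHat",
   "deviceGUID": "00000000-0000-0000-0000-000000000000"
}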

Prerequisites

  1. A Raspberry Pi 2 or 3, and a SenseHat
  2. An Azure subscription
  3. A Dynamics 365 instance that has two custom entities (Device and Device Alert) that are related to the default Case entity
  4. The Device Alert entity should be of the type: Activity
  5. The Device Alert form in Dynamics 365 should contain fields for the data sent from the Pi (Location, Temperature, Pressure and Humidity), plus a Subject and a lookup to the Device entity
  6. The Device form in Dynamics 365 can be a lot simpler.  The only field that’s really necessary is the Device Name itself; the rest of the fields are optional
  7. The Raspberry Pi should have Windows 10 IoT Core installed, which you can download and install through the Windows 10 IoT Core Dashboard
  8. The Raspberry Pi should be provisioned in Azure through the IoT dashboard


  9. The Raspberry Pi should be connected to your network
    1. You can connect from the default Windows 10 app once you’ve installed Windows 10 IoT Core
  10. And most importantly, you need to have followed Dimitris Gkanatsios’s fantastic blog article in order to create the Windows 10 app on the Raspberry Pi.  If you’ve successfully got to the “congratulations” message and you’re sending data to Azure, you’re good to go!

NOTE: I made two minor additions to Dimitris’s code.

Firstly, in SenseHatData.cs I added two additional strings, DeviceName and DeviceGUID:

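As a minimal sketch, the amended class might look something like this.  Only the two new strings come from this post; the existing properties and their types are assumed from Dimitris’s sample, so treat them as illustrative:

public class SenseHatData
{
    // Existing sensor readings (types assumed from Dimitris's sample)
    public double Temperature { get; set; }
    public double Humidity { get; set; }
    public double Pressure { get; set; }
    public string Location { get; set; }

    // The two additional strings, used later by the Logic App
    // to look up the Device record in Dynamics 365
    public string DeviceName { get; set; }
    public string DeviceGUID { get; set; }
}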

Secondly, in MainPage.xaml.cs I added data for the Location, DeviceName and DeviceGUID strings:

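Again, this is a sketch rather than a copy of the original code: wherever the SenseHatData instance is populated before being sent to the IoT Hub, the extra values are simply assigned alongside the sensor readings.  The variable name and literal values below are illustrative; the GUID is the one you retrieve as described next:

// Populate the additional fields before the reading is serialised and sent
data.Location = "Server Room";
data.DeviceName = "RaspberryPi-SenseHat";
data.DeviceGUID = "<GUID of the Device record in Dynamics 365>";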

You can get the GUID once you’ve completed Prerequisite 6 and have created a Device record for your Raspberry Pi.  To easily find the GUID, see this article here.

The reason for adding the GUID in the code (which does seem slightly odd) is that it’s required by the Get Record action in the Azure Logic App to look up the Device record in Dynamics 365.  The action requires the GUID rather than the Device record name because the record name may not necessarily be unique.

Step 1 – Create the Azure Service Bus queue

First off, let’s create the Azure Service Bus queue.

  1. In Azure, click New, then search for Service Bus, then click Create
  2. Enter a Name, Pricing Tier, Resource Group and Location, then click Create
  3. Click + Queue, give the queue a name, keep all the default settings, click Create
  4. Open the queue, click Shared access policies and add two policies: one with the Send claim and one with the Listen claim
  5. In the main blade of the service bus, click Connection Strings
  6. Click RootManageSharedAccessKey
  7. Copy the Connection String – Primary Key and paste it somewhere safe to use later

Step 2 – Create the Azure Stream Analytics job

Now we’re going to create a new Azure Stream Analytics job (i.e. in addition to the one created in Dimitris’s blog).

  1. In Azure, click New, then search for Stream Analytics job, then click Create
  2. Enter a Job Name, Subscription, Resource Group and Location, then click Create


  3. While the job is deploying, go to your IoT Hub, click Shared access policies and copy the primary key for the iothubowner policy
  4. Once the job has deployed, click Inputs, click Add, complete the New Input blade with the details of your IoT Hub (using the iothubowner key you just copied), then click Create
  5. Now click Outputs, click Add, select Sink: Service Bus Queue, then complete the rest of the New Output blade with the details of the service bus queue you created in Step 1
  6. Click Query, add the T-SQL query (a sketch is shown just after this list), then click Save
  7. Start the job
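
The query itself is very simple: it selects any reading where the humidity is above 45 and routes it to the service bus output.  A minimal sketch is shown below; the [iothub-input] and [servicebus-output] aliases are assumptions and must match the names you gave your Input and Output above.

SELECT
    *
INTO
    [servicebus-output]
FROM
    [iothub-input]
WHERE
    humidity > 45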

Step 3 – Create the Azure Logic App and add the service bus connection

A Logic App is essentially a workflow tool with pre-built connectors to popular services, such as Twitter.  It’s an incredibly simple way to create system-to-system integrations without using complex code.  In this step we’ll create an Azure Logic App and connect it to our service bus.

  1. In Azure, click New, then search for Logic App, then click Create
  2. Enter a Name, Pricing Tier, Resource Group and Location, then click Create
  3. Click Blank Logic App
  4. In the search box, enter Service bus to filter the list
  5. Select Service Bus – When a message is received in a queue (auto-complete)
  6. In the Connection Name field, enter the name of the service bus itself (NOT the connection name!)


  7. In the Connection String field, paste the string you copied in Step 1
  8. Click Create
  9. Once the connection has been created, enter the name of the queue you created in Step 1
  10. Set the frequency to 1 second

Step 4 – Add a Compose action in the Logic App

The below Compose action is necessary to remedy a known feature/bug within Azure whereby the JSON data in the service bus message gets inadvertently wrapped with an XML header.  This prevents the subsequent actions in the logic app from extracting the JSON payload, so we employ this Compose action to strip the message of the unwanted XML wrapper.

  1. Click + New Step, click Add an action
  2. Select Compose
  3. In the Inputs field add the following code:
@json(substring(base64toString(triggerBody()['ContentData']), indexof(base64toString(triggerBody()['ContentData']), '{'), sub(lastindexof(base64toString(triggerBody()['ContentData']), '}'), indexof(base64toString(triggerBody()['ContentData']), '{'))))

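In plain terms, the expression base64-decodes the trigger’s ContentData (the raw message body), locates the JSON object by finding the first ‘{’ and the last ‘}’, and parses the result with json().  Decoded, the message body looks roughly like this (the header and values shown are purely illustrative):

<serialisation header>{"temperature":24.3,"humidity":47.2,"pressure":1013.1,"location":"Server Room","deviceName":"RaspberryPi-SenseHat","deviceGUID":"…"}

Only the JSON between the braces is kept, so the later actions can reference individual fields such as deviceGUID or humidity.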

Step 5 – Add a Get Record action to the Logic App

Now we use the Get Record action to perform a lookup on the Dynamics 365 Device entity.  The action retrieves the Device record matching the GUID sent from the Pi, so that its values (in particular its unique identifier) can be passed into the Device Alert record we create next.

  1. Click + New Step, click Add an action
  2. In the search box, enter Dynamics to filter the list
  3. Select Dynamics 365 – Get record
  4. Click Sign In and enter a Dynamics 365 administrator’s username and password to create the connection to Dynamics 365
  5. Select your Organization Name and Entity Name
  6. Click inside the Item Identifier field and from the Dynamic Content dialog select the Compose: Outputs block

Our JSON has been stripped of its unwanted XML wrapper using the Compose in Step 4, and now we need to specify which JSON field/value pair to use in the Get record lookup.  We do this by using the code view to make one modification to the Outputs block.

  1. In the Logic Apps Designer menu bar, click </> Code View
  2. Scroll down to the path line of the Get_record action
  3. At the end of the path line, add “.deviceGUID” immediately after outputs(‘Compose’), as shown in the sketch after this list
  4. In the Logic Apps Designer menu bar, click Designer.  The Item Identifier field should now show “deviceGUID”
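
The full path line varies with your organisation and entity names, so the sketch below only illustrates the change: wherever the generated code referenced outputs('Compose'), it should now read outputs('Compose').deviceGUID, with everything else on the line left exactly as the designer generated it.  (The encodeURIComponent wrapper shown is an assumption based on how the designer typically generates the path.)

Before:  .../items/@{encodeURIComponent(outputs('Compose'))}"
After:   .../items/@{encodeURIComponent(outputs('Compose').deviceGUID)}"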

Step 6 – Add a Create Record action to the Logic App

Now that we’ve looked up our Device record, we can create a new Device Alert and populate it with the data from the Pi and a reference to the Device.

  1. Click + New Step, click Add an action
  2. In the search box, enter Dynamics to filter the list
  3. Select Dynamics 365 – Create a new record
  4. Enter the Organization name, Entity Name and Subject
  5. Click Show advanced options
  6. For each of the Location, Temperature, Pressure and Humidity fields in the new Device Alert record, add the Compose: Outputs block from the Dynamic Content dialog

Once again, we need to modify the Outputs block code to tell the logic app which specific JSON field/value to add to each field of the new record.

  1. In the Logic Apps Designer menu bar, click </> Code View
  2. Scroll down to the section of code starting “Create_a_new_record”
  3. Under “body” you should see a line of code for each of the Location, Temperature, Pressure and Humidity fields – similar to this:
"body": {
   "daniel_location": "@{outputs('Compose')}",
   "daniel_temperature": "@{outputs('Compose')}",
   "daniel_pressure": "@{outputs('Compose')}",
   "daniel_humidity": "@{outputs('Compose')}",
}
  1. At the end of the humidity line, add “.humidity” immediately after outputs(‘Compose’) 
  2. At the end of the temperature line, add “.temperature” immediately after outputs(‘Compose’) 
  3. At the end of the location line, add “.location” immediately after outputs(‘Compose’) 
  4. At the end of the pressure line, add “.pressure” immediately after outputs(‘Compose’) 
  5. Your code should now look similar to the sketch shown just after this list
  6. In the Logic Apps Designer menu bar, click Designer.
  7. The Location, Temperature, Pressure and Humidity fields should now each reference the corresponding value from the Compose output
  8. In the Device field, add the Get Record: Device (Unique identifier for entity instances) block from the Dynamic Content dialog
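
For reference, once the four suffixes have been added, the “body” section should look something like this (a sketch; the daniel_ prefix will match whatever publisher prefix your own fields use):

"body": {
   "daniel_location": "@{outputs('Compose').location}",
   "daniel_temperature": "@{outputs('Compose').temperature}",
   "daniel_pressure": "@{outputs('Compose').pressure}",
   "daniel_humidity": "@{outputs('Compose').humidity}"
}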

At this point you should have four steps in your Logic App: the Service Bus trigger followed by the Compose, Get record and Create a new record actions.

  1. Click Save

Step 7 – Create the Dynamics 365 workflow

Thankfully, all the hard work has now been done!  All that remains is to use a Dynamics 365 workflow to create the Case once the Device Alert has been created by the Azure Logic App.

(I won’t go into the detail of creating the workflow, I’ll assume you’ve got this part covered).

Basically, create the workflow with two steps:

  1. Create Case
  2. Set Regarding in Device Alert

For step 1, “Create Case”, populate the Case form with dynamic values taken from the Device Alert record.

For step 2, “Set Regarding in Device Alert”, set the Regarding field of the Device Alert to the Case created in step 1.

Conclusion

Assuming you’ve got this far (yes, I appreciate it was more of a transatlantic long haul than a weekend city break!), all you have to do now is breathe on the SenseHat to push the humidity reading above 45%, and you’ll see a nice list of new Device Alerts and associated Cases in Dynamics 365.


i.e. Connected Service.

