Dynamics 365 App for Microsoft Teams – a first look at the preview

Ever since Microsoft Teams was launched I’ve been asked by customers when an integration with Dynamics 365 would appear.  And now it has arrived, in the form of the Dynamics 365 (Preview) app.  First impressions?  It looks awesome!

[Screenshot: a Dynamics 365 record displayed in the Unified Interface inside a Teams channel]

Users can add any Dynamics 365 record into a Teams channel, enabling colleagues to view, update and collaborate on records without leaving the Teams environment. Even non-Dynamics 365 users can view records to drive collaboration across wider working groups.

Records are displayed in the latest, modern Unified Interface for an intuitive app experience that (once customers upgrade to v9) will become familiar across all devices – mobile, tablet etc.

And very nicely, a productivity bot helps users search and update their Dynamics 365 records via a chat conversation:

[Screenshot: the productivity bot conversation in Teams]

Watch the video here.


Managing approvals in Dynamics 365 with Microsoft Flow

I’ve been curious about the Approvals capability that is conspicuously nestled into the black navigation bar of Microsoft Flow for some time now.   Specifically, I’ve wondered whether Flow Approvals would work within the context of approving Dynamics 365 processes in an enterprise-sized organisation.

Flow is the ‘little brother’ of Logic Apps (see comparison here) and is designed for the business user (a.k.a. the Citizen Integrator) in self-service automation scenarios, so my question was: could it be used for larger scale enterprise-level approval processes too?

[Screenshot: Approvals in the Microsoft Flow navigation bar]

After all, Flow offers all the components of an approval chain ready to use, so it would be a shame not to take advantage.

What’s Available…

  1. An Approval action to automate the approval process
  2. A Flow template that uses the action for a Dynamics 365 deal approval
  3. An approvals centre on the Microsoft Flow website where users can respond to approval requests
  4. A mobile app for managing approval requests on the go

The Conclusion

After checking in Azure Active Directory that my test user had an assigned manager and then simplifying the template (by removing the “Check if opportunity closed in last 24 hours” condition), the Flow successfully ran.  So far so good.  Now I wanted to check if I could deploy it across my organisation for all users, and here’s the sticking point.

Flows can be shared to other users, Office 365 Groups and Security Groups (at which point the Flow ‘converts’ to a Team Flow)…

[Screenshot: sharing a Flow as a Team Flow]

…however, these shared users all effectively become admins and have edit permissions to the Flow.  So, a rogue user wanting to bypass the approval could simply edit the Flow.

To avoid this, we could share the Flow to users in a non-admin state by sharing as Run-Only Users:

[Screenshot: the Manage Run-Only Users option]

However, the Manage Run-Only Users option above is only available after using either the “Manually trigger a flow” or “For a selected item” triggers.

[Screenshots: the “Manually trigger a flow” and “For a selected item” triggers]

Neither of these options would really work in our highly scaled, enterprise context.  For example, “Manually trigger a flow” (as the name suggests) would require the user to break away from their Dynamics 365 process and initiate a secondary process in another app, adding extra clicks, time and effort – all when we are aiming to simplify and scale.

The other consideration is throttling.  The Approval action in Flow is throttled to one request per 300 seconds, i.e. 5 minutes.

[Screenshot: the Flow throttling limits]

And although Flow can be set so that the run-only user’s connection credentials are used (rather than all the shared users using the original creator’s credentials), once every 5 minutes isn’t very generous.

[Screenshot: run-only user connection settings for the Approval action]

Summary

A nice idea…that sadly won’t easily scale.  We’ll have to keep Flow Approvals for the self-service audience and use cases they were designed for.  But it was worth a try!

So, in my next article I’m going to use Logic Apps (which are intended to scale for large enterprise) to show how a Dynamics 365 approval process could be built in Azure.


Automating Customer Interactions with Bots – Contoso Flowers, Part 1

If your organisation is seeking ways to lower the cost of its customer interactions, while simultaneously appearing to be innovative, fun and modern, then Bots should be your next new customer channel.  (Social media is sooo last year).

Microsoft’s Bot Framework is a great new toolkit that enables you to build human-like dialogs between your customers and your Bot, with the purpose of transacting a specific business scenario – for example, ordering a product.

In Part 1 of this walk-through, I’ll show you how to set up the ‘Contoso Flowers’ Bot in Skype to demo a customer interacting with the Bot in order to buy some flowers.

In Part 2 (which I’ve not written yet), we’ll modify the Bot code to create an actual order record in Microsoft Dynamics 365.

 

Contoso Flowers

Contoso Flowers is a bot demo you can find in the Microsoft BotBuilder-Samples on GitHub.  It makes a great demo because it showcases a whole bunch of cool Bot Framework features, including Carousels, Cards and Bing Maps integration.

BTW, I’m not taking credit for the fantastic Bot code (that credit goes to people far more clever than me!); all I’m doing is explaining how to string together the different elements involved.
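To give a flavour of what’s inside the sample, here’s a rough, simplified sketch (not lifted from the sample itself) of how a Bot Builder v3 dialog sends a carousel of Hero Cards – the pattern Contoso Flowers uses for its product catalogue.  The product name, price and image URL below are made up purely for illustration:

using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Bot.Builder.Dialogs;
using Microsoft.Bot.Connector;

public static class CarouselSketch
{
    // Builds a one-card carousel reply and sends it back to the user.
    // Call this from inside a dialog's message handler (it needs the dialog's IDialogContext).
    public static async Task SendFlowerCardAsync(IDialogContext context)
    {
        var reply = context.MakeMessage();
        reply.AttachmentLayout = AttachmentLayoutTypes.Carousel;

        var card = new HeroCard(
            title: "Red Roses",                       // hypothetical product
            subtitle: "£29.99",
            images: new List<CardImage> { new CardImage("https://example.com/roses.jpg") },
            buttons: new List<CardAction> { new CardAction(ActionTypes.ImBack, "Buy", value: "Red Roses") });

        reply.Attachments = new List<Attachment> { card.ToAttachment() };
        await context.PostAsync(reply);
    }
}

The sample itself spreads this kind of logic across several dialogs and service classes, but the card-building pattern is the same.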

 

Pre-Requisites

For this walk-through, you will need:

  1. The C# “demo-ContosoFlowers” folder downloaded from BotBuilder-Samples
  2. Microsoft Visual Studio 2015
  3. The Bot Emulator installed from here
  4. An Azure subscription
  5. Visual Studio Tools for Azure installed

 

Step 1 – Create your Bot

  1. Go to the Bot Framework, sign in, then click Register a bot
  2. In the Bot Profile section, add a Name, Handle and Description
  3. Upload an Icon if you wish
  4. In the Configuration section, enter https://www.bbc.co.uk

    This URL is just a placeholder, we’ll update it later once we know the URL of the Azure Web App where we deploy our bot.
    [Screenshot: the Configuration section with the placeholder endpoint]
  5. Click Create Microsoft App ID and password
  6. Note these down
  7. Click Create an app password to continue
  8. IMPORTANT: note this down, you won’t get another chance!
  9. Click Finish and go back to bot framework
  10. Confirm the legal bits then click Register

Now you’ll see all the channels where we’ll later be able to deploy our Bot, cool huh?!

 

Step 2 – Bot Code

  1. Open the ContosoFlowers solution downloaded from Pre-Requisite 1 in Visual Studio 2015
  2. Open Web.config
  3. Under appSettings, enter your BotID (Bot Handle), Microsoft App ID and Microsoft App Password for the Bot you created in Step 1 (see the sketch after this list)
  4. Save
  5. Build Solution
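Once complete, the appSettings section of Web.config should end up looking something like this (key names as per the standard BotBuilder sample template; the values are the ones you noted down in Step 1):

<appSettings>
  <add key="BotId" value="YourBotHandle" />
  <add key="MicrosoftAppId" value="YourMicrosoftAppId" />
  <add key="MicrosoftAppPassword" value="YourMicrosoftAppPassword" />
</appSettings>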

 

Step 3 – Test in the Bot Emulator

At this point, the code should “just work” – but there’s no harm in testing anyway.

  1. In Visual Studio, select Debug, Any CPU, Microsoft Edge

    [Screenshot: the Visual Studio debug toolbar set to Microsoft Edge]
  2. Start debugging by clicking Microsoft Edge
  3. A new browser window will pop up in Edge; copy the URL of this browser window (which should be http://localhost:xxxx)
  4. Append /api/messages to the end of the URL
  5. Open the Bot Emulator from Pre-Requisite 3
  6. Paste the whole, appended URL into the endpoint URL of the Bot Emulator
  7. Enter your Bot’s Microsoft App ID and Microsoft App Password created in Step 1
  8. Click Connect
  9. If all’s running well, you’ll now be able to interact with your Bot in the emulator
  10. Stop Debugging

 

Step 4 – Deploy your Bot Code to Azure

Now that we’re happy our bot is working as expected, we can send him (or her) up to a new home in Azure.

  1. Log on to Azure: http://portal.azure.com/
  2. Click New, Web + Mobile, Web App, then click Create
  3. Give your Web App a name, then select a Subscription, Resource Group and Location
  4. Click Create
  5. Once the deployment has completed, note down the URL of the Web App (this is what we’ll be replacing our https://www.bbc.co.uk placeholder with shortly)
  6. Back in Visual Studio, right-mouse click on the ContosoFlowers solution in the Solution Explorer, click Publish

    [Screenshot: the Publish option in Solution Explorer]

  7. In the Select a Publish Target dialog box, click Microsoft Azure App Service

    [Screenshot: the Select a Publish Target dialog]

  8. If you’ve got the Visual Studio Tools for Azure installed as per Pre-Requisite 5, and you’ve already signed into your Azure account within Visual Studio, then you’ll simply be presented with a list of your existing Azure Web Apps (in their resource groups). Select the Web App you created above and click OK

    [Screenshot: selecting the existing Azure Web App to publish to]

  9. Your connection settings will now be automatically populated in the next dialog, click Publish

    [Screenshot: the automatically populated connection settings]

Step 5 – Add your Bot to Skype

  1. Back in the Bot Framework, click on My bots then select your ContosoFlowers bot
  2. In the Details section, click Edit
  3. Take the URL of the Web App you created in Step 4
    1. add an “s” after “http”
    2. append /api/messages to the end of the URL
    3. use this amended URL to replace the bbc.co.uk placeholder in the messaging endpoint (for example, if your Web App URL were http://contosoflowersbot.azurewebsites.net, the messaging endpoint would become https://contosoflowersbot.azurewebsites.net/api/messages)
  4. Click Save Changes
  5. Click Test to test the bot connection.  If all goes well you should see the “Endpoint authorization succeeded” message

    [Screenshot: the “Endpoint authorization succeeded” message]

  6. In the list of channels, click the Edit button next to Skype

    [Screenshot: the Skype channel in the channels list]

  7. In the Settings dialog, ensure Enable the bot on Skype, Messages Types and Cards are all switched On

    [Screenshot: the Skype channel settings]

  8. Click I’m done configuring
  9. Click Add to Skype

    [Screenshot: the Add to Skype button]

  10. Then start chatting with your Bot!

[Screenshot: chatting with the Contoso Flowers Bot in Skype]

Some final thoughts…

  1. I noticed the Bot doesn’t successfully add to Skype Preview (if you’re a Windows 10 user and have Skype Preview).  Best to add to ‘normal’ Skype first, then open in Skype Preview after, if this is your preferred client
  2. You’ll need to add the Bot to your Skype contacts before you can start interacting with it
  3. I assume, like me, you’re only building your bot to demo and play around with, so there’s no need to Publish your bot via the Bot Framework and make it publicly accessible
  4. If you’re halfway through a bot conversation and want to start again, type stop
  5. In Part 2 of this post (not yet written), I’ll modify the Bot Code to integrate back to Dynamics 365


Understand the sentiment of your customers’ tickets with Azure Cognitive Services

Wouldn’t it be great if service Cases could be automatically categorised as Happy or Unhappy, so that the unhappy ones could be routed to a high priority queue for quick processing?

Well, Azure Cognitive Services are a group of cloud intelligence services that help you to do just that.  There are a whole bunch of cool services available (from product catalogue recommendations to speech intent analysis), and we’ll use the Sentiment Analysis capability within the Text Analytics API to create a happiness rating.

And by using an Azure Logic App to do the actual processing work, we’ll have zero coding to do.  Yes, no code = happy sentiment.
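For the curious, the Detect Sentiment action we’ll use below simply wraps the Text Analytics REST endpoint.  A rough illustration of the request and response (the region in the URL and the subscription key come from your own Cognitive Services account, and at the time of writing the endpoint version is v2.0):

POST https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment
Ocp-Apim-Subscription-Key: <your Text Analytics key>

{
  "documents": [
    { "id": "1", "language": "en", "text": "Customer felt let down by a late delivery..." }
  ]
}

…returns something along these lines:

{
  "documents": [ { "id": "1", "score": 0.04 } ],
  "errors": []
}

The Logic App makes this call for us, so there’s still nothing to code.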

Outcome

So this is what we want to achieve:

  1. When a service Case is created…

[Screenshot: a new Case with its Description completed]

  2. An Azure Logic App will be triggered that will:
    1. Read the Description field
    2. Generate a sentiment score from 0 to 1 (where 0 is very unhappy, and 1 is very happy) based on the contents of the Description
    3. Update the Case Sentiment Score field in the Case record

[Screenshot: the Case Sentiment Score field populated on the Case]

  3. From here you can then trigger any CRM process you like: queue routing, workflow notifications, etc.

Step 1 – create the sentiment score field

Create a new field in Dynamics 365 and call it Case Sentiment Score.  Add the field to your Case form.

Step 2 – create the logic app

(This step assumes you have an Azure Subscription; if not, you can sign up for a free trial here.)

In Azure, create a blank Logic App:

  1. Click New > Web + Mobile > Logic App
  2. Give it a Name, Subscription, Resource Group and Location
  3. Click Create

[Screenshot: creating the Logic App]

  4. Open your newly created Logic App and select the Blank Logic App template

[Screenshot: the Blank Logic App template]

Step 3 – add the three logic app steps

  1. In the Logic Apps Designer, enter “Dyn” into the search box
  2. Select Dynamics 365 – When a record is created
  3. Sign in to your Dynamics 365 org to create the connection between Azure and Dynamics 365
  4. Select the Organization Name for your CRM instance
  5. For Entity Name, select Cases
  6. For Frequency, select 1 second
  7. Click New Step
  8. Click Add an action
  9. Enter “Cog” into the search box
  10. Select Cognitive Service Text Analytics – Detect Sentiment
  11. Click into the Text box and select Description from the Dynamic Content
  12. Click New Step
  13. Click Add an action
  14. Enter “Dyn” into the search box
  15. Select Dynamics 365 – Update a record
  16. Select the Organization Name for your CRM instance
  17. For Entity Name, select Cases
  18. Click into the Record identifier box and select Case from the Dynamic Content
  19. Click into the Case Sentiment Score box (i.e. the field that you created in Step 1) and select Score from the Dynamic Content
  20. Click into the Case Title box and select Case Title from the Dynamic Content
  21. Click into the Customer box and select Customer from the Dynamic Content
  22. Click Save

Test and play

All that’s left now is to test it out.  Create a few service cases that have descriptions with different levels of ‘happiness’, wait a few seconds for the Azure service to run, then refresh the Case page to see the sentiment score.

For example, this description…

“Customer felt let down by a late delivery, when they called in they were stuck in a ‘your call is important to us’ queue for 10 minutes.  Customer eventually hung up.”

…produces a Sentiment Score of 0.04, i.e. very unhappy.

and this description…

“Just want to say thanks for the super quick delivery, I really feel you went out of your way to get the parcel to me on time. Will definitely recommend your great service to my friends!”

…produces a Sentiment Score of 0.97, i.e. very happy.

Use cases

There are a couple of use cases I see for this capability.

The first is that once you have quantified sentiment into a score, the CRM system can easily identify customers who are not best pleased and move their cases into a higher priority queue. Dealing with these customers as quickly as possible is more likely to improve satisfaction.

The second use case is in pre-warning the case agent that the customer is unhappy before the agent makes any follow-up calls.  With a quantified sentiment score, a visual flag or status can be set on the CRM system to warn the agent that extra sensitivity is required, which once again is more likely to result in improved satisfaction.


Create ‘Connected Service’ cases in Dynamics 365 using an IoT device and Microsoft Azure

“Connected Service” is the name we give to the concept of a true end-to-end connection between an IoT-enabled device (such as a modern air conditioning unit or a lift) and the customer service department that would despatch an engineer to repair the device should it break down.

The idea is that, because the IoT device is both connected to the internet and has a bunch of sensors that monitor its temperature, vibration etc., it is able to send ‘health’ messages to its service department who can analyse this health data and decide whether to repair or swap out any given component.  This predictive maintenance model enables continuous, uninterrupted operations, without the down-time resulting from the more traditional and reactive break-fix service model.

This blog post explains how to set up a demo of this concept using a Raspberry Pi, Azure services and Dynamics 365.

Expected Outcome

[Photo: the Raspberry Pi with its SenseHat]

Once successfully set up, you’ll have a Raspberry Pi with a SenseHat that will trigger a service case to be created in Dynamics 365 when the humidity sensor reports humidity greater than 45%.

[Screenshot: the resulting Case in Dynamics 365]

Connecting the dots…

This is the chain of events that will be triggered by breathing on the Pi’s SenseHat:

  1. Every second, the Pi SenseHat captures temperature, humidity and pressure data and sends this to an Azure IoT Hub
  2. The Azure IoT Hub authenticates the Pi device and ingests the data, making the data available for the next stage…
  3. …which is an Azure Stream Analytics job.  This job is a simple SQL query that says “if humidity is greater than 45, pass the data to an Azure Service Bus”
  4. The Azure Service Bus queues the data in a service bus message, ready to be picked up by…
  5. …an Azure Logic App that integrates to Dynamics 365 and creates a new Dynamics 365 Device Alert record that contains all the data sent from the Pi
  6. Once the Device Alert record has been created, Dynamics 365’s workflow kicks in to create the service case
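Concretely, the message travelling through this chain is a small JSON document along these lines (the values are illustrative, and the property names are an assumption based on the Logic App expressions later in this post – location, temperature, pressure, humidity and deviceGUID):

{
  "deviceName": "RaspberryPi-01",
  "deviceGUID": "00000000-0000-0000-0000-000000000000",
  "location": "London Office",
  "temperature": 24.1,
  "humidity": 52.3,
  "pressure": 1013.2
}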

Prerequisites

  1. A Raspberry Pi 2 or 3, and a SenseHat
  2. An Azure subscription
  3. A Dynamics 365 instance that has two custom entities (Device and Device Alert) that are related to the default Case entity as follows:
    [Diagram: the Device, Device Alert and Case entity relationships]
  4. The Device Alert entity should be of the type: Activity
  5. The Device Alert form in Dynamics 365 should contain the following fields:
    [Screenshot: the Device Alert form]
  6. The Device form in Dynamics 365 can be a lot simpler.  The only field that’s really necessary is the Device Name itself, the rest of the fields are optional:
    [Screenshot: the Device form]
  7. The Raspberry Pi should have Windows 10 IoT Core installed, which you can download and install through the Windows 10 IoT Core Dashboard
  8. The Raspberry Pi should be provisioned in Azure through the IoT dashboard

[Screenshot: provisioning the device through the IoT dashboard]

  9. The Raspberry Pi should be connected to your network
    1. You can connect from the default Windows 10 app once you’ve installed Windows 10 IoT Core
  10. And most importantly, you need to have followed Dimitris Gkanatsios’s fantastic blog article in order to create the Windows 10 app on the Raspberry Pi.  If you’ve successfully got to the “congratulations” and you’re sending data to Azure, you’re good to go!

NOTE: I made two minor additions to Dimitris’s code.

Firstly, in SenseHatData.cs I added two additional strings, DeviceName and DeviceGUID:

[Screenshot: the SenseHatData.cs additions]
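Roughly speaking, the additions look like this (a sketch only – whether you add them as plain fields or auto-properties doesn’t matter, as long as they serialise along with the existing readings):

// Two extra values added to the SenseHatData class alongside the existing sensor readings
public string DeviceName { get; set; }
public string DeviceGUID { get; set; }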

Secondly, in MainPage.xaml.cs I added data for the Location, DeviceName and DeviceGUID strings:

[Screenshot: the MainPage.xaml.cs additions]
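Again as a sketch, assuming the telemetry object in MainPage.xaml.cs is called data (adjust to whatever it is actually named in your copy); the values below are placeholders:

data.Location = "London Office";                              // hypothetical location
data.DeviceName = "RaspberryPi-01";                           // hypothetical device name
data.DeviceGUID = "00000000-0000-0000-0000-000000000000";     // GUID of the Device record created for your Pi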

You can get the GUID once you’ve completed Prerequisite 6 and have created a Device record for your Raspberry Pi.  To easily find the GUID, see this article here.

The reason for adding the GUID in the code (which does seem slightly odd) is because it’s required by the Get Record action in the Azure Logic App to lookup the Device record in Dynamics 365.  The Action requires the GUID rather than the Device record name because the record name may not necessarily be unique.

Step 1 – Create the Azure Service Bus queue

First off, let’s create the Azure Service Bus queue.

  1. In Azure, click New, then search for Service Bus, then click Create
  2. Enter a Name, Pricing Tier, Resource Group and Location, then click Create
    [Screenshot: creating the service bus]
  3. Click + Queue, give the queue a name, keep all the default settings, click Create
  4. Open the queue, click Shared access policies and add the following policies for Send and Listen
    [Screenshot: the Send and Listen shared access policies]
  5. In the main blade of the service bus, click Connection Strings
  6. Click RootManageSharedAccessKey
  7. Copy the Connection String – Primary Key and paste it somewhere safe to use later

Step 2 – Create the Azure Stream Analytics job

Now we’re going to create a new Azure Stream Analytics job, (i.e. in addition to the one created in Dimitris’s blog).

  1. In Azure, click New, then search for Stream Analytics job, then click Create
  2. Enter a Job Name, Subscription, Resource Group and Location, then click Create

[Screenshot: creating the Stream Analytics job]

  3. While the job is deploying, go to your IoT Hub, click Shared access policies and copy the primary key for the iothubowner role
  4. Once the job has deployed, click Inputs, click Add, complete the New Input blade as follows, click Create
    [Screenshot: the New Input blade]
  5. Now click Outputs, click Add, select Sink: Service Bus Queue, then complete the rest of the New Output blade as follows, adding the details of the service bus queue you created in Step 1
    [Screenshot: the New Output blade]
  6. Click Query, add your T-SQL query (a sketch of what it could look like follows this list), then click Save
  7. Start the job
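For reference, a minimal sketch of the kind of query used here.  The input and output aliases (IoTHubInput and ServiceBusQueueOutput) are placeholders for whatever names you gave the input and output above, and humidity must match the property name in the JSON the Pi sends:

SELECT
    *
INTO
    [ServiceBusQueueOutput]
FROM
    [IoTHubInput]
WHERE
    humidity > 45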

Step 3 – Create the Azure Logic App and add the service bus connection

A Logic App is essentially a workflow tool with pre-built connectors to popular services, such as Twitter.  It’s an incredibly simple way to create system-to-system integrations without using complex code.  In this step we’ll create an Azure Logic App and connect it to our service bus.

  1. In Azure, click New, then search for Logic App, then click Create
  2. Enter a Name, Pricing Tier, Resource Group and Location, then click Create
    [Screenshot: creating the Logic App]
  3. Click Blank LogicApp
  4. In the search box, enter Service bus to filter the list
  5. Select Service Bus – When a message is received in a queue (auto-complete)
  6. In the Connection Name field, enter the name of the service bus itself (NOT the connection name!)

[Screenshot: the service bus connection settings]

  7. In the Connection String field, paste the string you copied in Step 1
  8. Click Create
  9. Once the connection has been created, enter the queue name you created in Step 1
  10. Set the frequency to 1 second
    [Screenshot: the completed trigger]

Step 4 – Add a Compose action in the Logic App

The below Compose action is necessary to remedy a known feature/bug within Azure whereby the JSON data in the service bus message gets inadvertently wrapped with an XML header.  This prevents the subsequent actions in the logic app from extracting the JSON payload, so we employ this Compose action to strip the message of the unwanted XML wrapper.

  1. Click + New Step, click Add an action
  2. Select Compose
  3. In the Inputs field add the following code:
@json(substring(base64toString(triggerBody()['ContentData']), indexof(base64toString(triggerBody()['ContentData']), '{'), sub(lastindexof(base64toString(triggerBody()['ContentData']), '}'), indexof(base64toString(triggerBody()['ContentData']), '{'))))
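In plain terms, this expression base64-decodes the queued message content, takes the substring between the first ‘{’ and the last ‘}’ (i.e. the JSON payload, minus the XML wrapper either side) and parses it with json(), so that the later actions can read its individual properties.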

[Screenshot: the completed Compose action]

Step 5 – Add a Get Record action to the Logic App

Now we use the Get Record action to perform a lookup on the Dynamics 365 Device entity. The logic app action will store the lookup values from the Device entity and enable us to pass them into the Device Alert entity.

  1. Click + New Step, click Add an action
  2. In the search box, enter Dynamics to filter the list
  3. Select Dynamics 365 – Get record
  4. Click Sign In and enter a Dynamics 365 administrator’s username and password to create the connection to Dynamics 365
  5. Select your Organization Name and Entity Name
  6. Click inside the Item Identifier field and from the Dynamic Content dialog select the Compose: Outputs block

Our JSON has been stripped of its unwanted XML wrapper using the Compose in Step 4, and now we need to specify which JSON field/value pair to use in the Get record lookup.  We do this by using the code view to make one modification to the Outputs block.

  1. In the Logic Apps Designer menu bar, click </> Code View
  2. Scroll down to the Get_record: path line
  3. At the end of the path line, add “.deviceGUID” immediately after outputs(‘Compose’) (see the sketch after this list)
  4. In the Logic Apps Designer menu bar, click Designer.  The Item Identifier field should now show “deviceGUID”
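Conceptually, the fragment inside the path expression changes from the whole Compose output to just its deviceGUID property, roughly:

outputs('Compose')   becomes   outputs('Compose').deviceGUID

(the exact surrounding path syntax generated by the designer will vary).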

Step 6 – Add a Create Record action to the Logic App

Now we’ve looked up our Device record, we can create a new Device Alert and populate it with our Device values

  1. Click + New Step, click Add an action
  2. In the search box, enter Dynamics to filter the list
  3. Select Dynamics 365 – Create a new record
  4. Enter the Organization name, Entity Name and Subject
  5. Click Show advanced options
  6. For each of the Location, Temperature, Pressure and Humidity fields in the new Device Alert record, add the Compose: Outputs block from the Dynamic Content dialog

Once again, we need to modify the Output block code to tell the logic app which specific JSON field/value to add to the new record

  1. In the Logic Apps Designer menu bar, click </> Code View
  2. Scroll down to the section of code starting “Create_a_new_record”
  3. Under “body” you should see four lines of code for each of the Location, Temperature, Pressure and Humidity fields – similar to this:
"body": {
   "daniel_location": "@{outputs('Compose')}",
   "daniel_temperature": "@{outputs('Compose')}",
   "daniel_pressure": "@{outputs('Compose')}",
   "daniel_humidity": "@{outputs('Compose')}",
}
  4. At the end of the humidity line, add “.humidity” immediately after outputs(‘Compose’)
  5. At the end of the temperature line, add “.temperature” immediately after outputs(‘Compose’)
  6. At the end of the location line, add “.location” immediately after outputs(‘Compose’)
  7. At the end of the pressure line, add “.pressure” immediately after outputs(‘Compose’)
  8. Your code should now look similar to the sketch shown after this list
  9. In the Logic Apps Designer menu bar, click Designer.
  10. The fields should now look similar to this:
    [Screenshot: the populated fields in the Designer]
  11. In the Device field, add the Get Record: Device (Unique identifier for entity instances) block from the Dynamic Content dialog
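After steps 4 to 7 above, the body block should read roughly as follows (the daniel_ prefixes will of course match whatever publisher prefix your own solution uses):

"body": {
   "daniel_location": "@{outputs('Compose').location}",
   "daniel_temperature": "@{outputs('Compose').temperature}",
   "daniel_pressure": "@{outputs('Compose').pressure}",
   "daniel_humidity": "@{outputs('Compose').humidity}",
}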

At this point you should have four actions in your Logic App:

[Screenshot: the completed Logic App with its four actions]

Finally, click Save.

Step 7 – Create the Dynamics 365 workflow

Thankfully, all the hard work has now been done!  All that remains is to use a Dynamics 365 workflow to create the Case once the Device Alert has been created by the Azure Logic App.

(I won’t go into the detail of creating the workflow, I’ll assume you’ve got this part covered).

Basically, create the workflow with two steps:

[Screenshot: the two-step workflow]

For step 1 “Create Case”, add these dynamic values to the Case form:

[Screenshot: the dynamic values for the Create Case step]

and for step 2 “Set Regarding in Device Alert”, add this dynamic value to the Device Alert form:

[Screenshot: the dynamic value for the Set Regarding step]

Conclusion

Assuming you’ve got this far (yes, I appreciate it was more of a transatlantic long haul than a weekend city break!) then all you have to do now is breathe on the SenseHat to get the humidity reading above 45% and you’ll get a nice list of new Device Alerts and associated Cases in Dynamics 365:

[Screenshot: the list of new Device Alerts and associated Cases]

i.e. Connected Service.


How to find the GUID of a Dynamics 365 record

If you need to find the GUID of a record in Dynamics 365, the quickest way to find it (without wasting time trawling through the F11 or F12 developer tools from whatever browser you are using) is simply to use Export to Excel.

From any View of your record, click Export to Excel.

[Screenshot: the Export to Excel button]
You’ll notice columns A, B and C are hidden.  Unhide these columns; Column A is your GUID!

[Screenshot: the unhidden GUID column in the exported spreadsheet]

Broadcast Announcements in Dynamics 365

A small feature customers occasionally ask for is the capability for Dynamics 365 to broadcast news announcements in a highly visible screen location, so that no user can fail to read the post.

Ideally, the ability to broadcast messages using the same yellow system announcement bar would be perfect, but that’s not going to happen without introducing unsupported customizations.

[Screenshot: the yellow system announcement bar]

The next best thing is to use Dynamics 365’s Announcements capability, which, although it does not broadcast its messages in the navbar, does allow you to embed the announcements as a Web Resource in any dashboard or form.  This gives you plenty of options to highlight announcements in the screens users are most commonly going to use: the Opportunity form, CSR Dashboard etc.

Full instructions on the easy setup are here, with results looking like this:

[Screenshot: announcements embedded in a dashboard via a Web Resource]

Of course, the customers that ask for this capability usually aren’t aware of Yammer – which is a far better alternative overall!