*** Update 1/29/2019 – I’ve been looking at the newer Identity dlls, and managed to get Message Center working but couldn’t get to Planner (yet) – so if you are using this code best to stick with https://www.nuget.org/packages/Microsoft.IdentityModel.Clients.ActiveDirectory/2.29.0 and not the v3 or v4. I’ll update if I get v4 working ***
TL;DR version – Create a plan with buckets for you products, get an AppId for a Native App that can read SHD and read/write all groups – then update the various variables in Application Settings for the Function or directly in the PowerShell – and create your custom products.json. If you need more pointers – read on…
*** Update 10/28/2017 – new sample code – corrected hard-coded tenant ID – added as a variable or application setting depending on the script. You can find it by going to the Admin Portal, then the Admin Center for Azure AD, then the Properties item under Manage – and the Directory ID is the GUID you are looking for. ***
*** Update 10/24 – I should have mentioned – works just fine if your plan is one hosted as a tab in Microsoft Teams too! Find your PlanId using Graph Explorer ***
I’ve commented before on the challenge that some of our Project and Planner customers have in keeping up with changes in Office 365 – particularly when they are not the Global Admin or service administrator and are not able to see the Message Center – and when their Global Admin isn’t always seeing the relevance or importance of the messages posted – so here is a potential solution! How about an automated way that reads the messages and posts them to Planner – automatically placing the ones you are interested in into their own bucket and assigning them to the person who owns that work stream. Interested? I thought so. As a final enticement – here is what we will end up with.
The example I’ve put together is just a sample – no support or warranty, and there are a few different ways you might want to use it or configure it – and as I walk through the configuration I’ll point out some of the ideas I had along the way. I decided to use Azure Functions – one to read the message center posts (on a timer), which then writes my chosen messages to an Azure storage queue and then a second Azure Function is triggered which reads from the queue and creates the Planner tasks. It would also be possible to do all this in one PowerShell script from your desktop if you wanted to. The function and script would need to run with the identity of an admin who can access the Message Center, so if you are reading this and can’t get to Message Center then you’ll need to work with your IT team and/or Global Admin to make this all happen. Tell them I sent you.
In my example I am interested in posts about Project, Planner, SharePoint, Skype, Yammer and Teams, and I have these associated with the resources in my team. I am using the title of the message center post and matching on my products – and if two products are mentioned then I put a task in each bucket (see the Skype for Business integration in Yammer task in the above screenshot). I did wonder about having an ‘other’ bucket too, so that nothing got lost just because it didn’t happen to mention a product of interest in the title. I decided to hold these relationships in a json file and hard code the Ids for the related buckets and the resource I was going to assign – just to avoid additional work in the function to pull static information. Likewise I hold the Id of my Plan in the application settings. One possible variant on this would be to also hold the Plan Id in the json file if you wanted to spread the different messages across different plans.
In this post I’ll just be getting things set up and pasting in the PowerShell and getting things working – I plan on a follow up to dig deeper into the actual PowerShell and some of the choices I made.
Step 1 – Create a Plan and capture some constants
Nothing special needed here, just a blank plan with members added and buckets created for your desired products. I’ll create a new one as I walk through – and mine will be called Office 365 Change Management. I’ve set it to Public and checked the subscribe option so my members will get e-mails.
I add my members and buckets.
and now is a great time to capture my planId – which can be seen at the end of the Url:
One other thing I need to do for setup is to set the Category names. As this is just a one-off I didn’t code it in the function – as it is global to the Plan (but you could add it as a ‘first run’ option, I guess). I just created a test task in my Plan and set the categories to match some of the flags used in the Message Center. Even after deleting this test task the categories remain set for any other tasks added to the Plan.
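If you did want to script this one-off rather than use a test task, a hedged sketch against the Graph API might look like the following – it assumes you already have a bearer token in $token and your plan id in $planId, and the label names are just examples:

```powershell
# Optional sketch: set the category labels via Graph instead of a test task.
# Label names here are illustrative examples, not a required set.
$body = @{
    categoryDescriptions = @{
        category1 = "Plan For Change"
        category2 = "Action Required"
    }
} | ConvertTo-Json

if ($token) {
    $headers = @{ Authorization = "Bearer $token"; "Content-Type" = "application/json" }
    # plannerPlanDetails updates require the current ETag in an If-Match header
    $details = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/planner/plans/$planId/details" -Headers $headers
    Invoke-RestMethod -Method Patch `
        -Uri "https://graph.microsoft.com/v1.0/planner/plans/$planId/details" `
        -Headers ($headers + @{ "If-Match" = $details.'@odata.etag' }) `
        -Body $body
}
```

Note the If-Match header – Planner details objects are versioned, so a PATCH without the current ETag will be rejected.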
I also need the Ids of my buckets and resources – and for these the easiest way is to use the Graph Explorer. If you log in to your appropriate Office 365 tenant and then make a v1.0 GET call to a Url like this one – https://graph.microsoft.com/v1.0/Planner/Plans/<your plan id>/buckets – that will return a list of your buckets, and the piece you need is the id. It’s worth saving off the full response to a file.
To get the members you can get the ‘owner’ field for the Plan (this is actually the Group Id) using https://graph.microsoft.com/v1.0/Planner/Plans/<your plan id>
and then use this in the Group Graph call for members
https://graph.microsoft.com/v1.0/Groups/<the owner id from the plan>/members
Here is my first member – and I’ve highlighted the id for Kari.
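If you’d rather script these lookups than use Graph Explorer, the same calls can be made with Invoke-RestMethod – this sketch assumes a bearer token in $token and your plan id in $planId:

```powershell
# Sketch: pull bucket and member ids via the Graph REST endpoints
if ($token) {
    $headers = @{ Authorization = "Bearer $token" }

    # Buckets for the plan - the 'id' values go into products.json
    $buckets = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/planner/plans/$planId/buckets" -Headers $headers
    $buckets.value | Select-Object name, id

    # The plan's 'owner' property is the Group id - use it to list the members
    $plan    = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/planner/plans/$planId" -Headers $headers
    $members = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/groups/$($plan.owner)/members" -Headers $headers
    $members.value | Select-Object displayName, id
}
```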
Step 2 – The products.json file
Now I have my products, buckets and resource ids I can create my json lookup file that I’ll be using from PowerShell. The format is as follows – obviously your ids will be different and feel free to choose different products – this file drives the selection, placement and assignment of messages to tasks.
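In case the format doesn’t copy cleanly, here is an illustrative shape and how PowerShell would consume it – the field names (product, bucketId, assigneeId) are my guesses at a minimal layout; the sample products.json attached to this post is the authoritative version:

```powershell
# Illustrative only - the attached sample products.json is authoritative.
$json = @'
[
  { "product": "Planner",    "bucketId": "<bucket id from Graph>", "assigneeId": "<member id from Graph>" },
  { "product": "SharePoint", "bucketId": "<bucket id from Graph>", "assigneeId": "<member id from Graph>" }
]
'@

$products = $json | ConvertFrom-Json

# Match a message title against each product name - a title mentioning two
# products matches twice, which is what puts a task in each bucket
$title = "Updated feature: Planner mobile app"
$hits = @($products | Where-Object { $title -match $_.product })
$hits | ForEach-Object { Write-Output $_.product }
```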
Step 3 – We need to get an Application Id from Azure AD
An Application Id, sometimes called an AppId or a ClientId, is a code from Azure that has certain permissions associated with it and is then used, along with your credentials, to get a token that allows you to do stuff with the various APIs. In my scenario we will be hitting both the Office 365 Service Communications API as well as the Graph API. I’ll be using the same AppId for both and setting the necessary permissions, but you could also use two different AppIds, as the different APIs are used by my two different functions. To get an Application Id you need to navigate to the Azure AD Admin Center for your Office 365 account (which may not be the same account as your Azure Portal account). From the Office 365 Admin Portal navigate to the Admin Centers list in the left navigation – Azure AD is usually found at the bottom.
Once you land at aad.portal.azure.com you click on App registrations option where we will create a new App registration and get our Id.
Click New application registration at the top of the App registrations blade, enter a name for your App (it can be anything), make sure you select Native, and the Redirect URI can be your O365 tenant. The Create button is at the foot of the blade (move the focus to another field if it isn’t active).
Next select the App registration you just created and copy the Application ID from the Essentials pane for use later – and click All settings so we can give it some permissions – by selecting Required permissions on the next blade.
I’ll be setting all permissions required on this single App ID – but if you wanted to split them, then the function that writes the plans needs one more permission added to the Windows Azure Active Directory API – Read and write all groups (Sign in and read user profile will already be selected). You will then also need to click Save, and then Grant Permissions (and Yes to the dialog) in the left blade. Granting permissions for a Native App is akin to the dialog you will no doubt have seen in some web apps, where you have to acknowledge that the App can do stuff for you before it takes you to the App itself.
For the other permissions required to read the message center posts we will be adding another API – select the Add option, then Select an API, and choose Office 365 Management APIs. From the list of delegated permissions we only need the top one (as I write this anyway) – Read service health information for your organization. It’s worth scanning the others just to get an idea of what other data this API may allow you to read for other applications. Select, Done and then Grant Permissions will finish this step (and you did copy the Application ID, didn’t you?). Our permissions page showing the API we just added should look like this:
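As a preview of how the Application ID gets used, here is a sketch of token acquisition with the uploaded ADAL dll (v2.29, per the update at the top of this post). The file path assumes the dll sits beside a function named ReadMessagesOnTimer – adjust if yours differs – and the credentials come from the application settings we’ll add in Step 4:

```powershell
# Sketch: get a bearer token with the ADAL v2 dll and the App Id from this step
$adalPath = "D:\home\site\wwwroot\ReadMessagesOnTimer\Microsoft.IdentityModel.Clients.ActiveDirectory.dll"

if (Test-Path $adalPath) {
    Add-Type -Path $adalPath

    $authority   = "https://login.microsoftonline.com/$env:aadtenant"
    $authContext = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext($authority)
    $cred        = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.UserCredential($env:aad_username, $env:aad_password)

    # Resource is the API being called: https://graph.microsoft.com for Graph,
    # https://manage.office.com for the Service Communications API
    $authResult = $authContext.AcquireToken("https://graph.microsoft.com", $env:clientId, $cred)
    $token = $authResult.AccessToken
}
```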
Step 4 – Prepare our Azure Function App
(If you’d rather just run the PowerShell from your desktop – and you have the permissions – then you can skip the Azure stuff: meet me again at the last step, where I put the same logic into a single desktop script)
If you already have an Azure subscription then you are good to go – or you can create a free account if you just want to try things out. You get a $200 credit (or local equivalent) and you also get a free amount of execution time each month (400,000 GB-s) and 1 million executions. There is also a small cost for the storage account which also gets used for the queue. https://azure.microsoft.com/en-us/free/
Once you have a subscription, click New in the upper left and in the Compute section you will find Function App. Once clicked it will prompt you for a name (which needs to be unique across Azure – I usually prefix things I’m working on with my alias). You also select your subscription and either create a new or use an existing resource group (I’m using a new one – it makes cleaning up easier, as I can just delete the resource group when I’m finished – if this is a keeper for you then you might want to use an existing one). I’m going with the consumption plan, the West US location and a new storage account, and finally turning on Application Insights – which gives some useful telemetry on your working system. Check Pin to dashboard so you can easily find your Function App again and click Create.
It just takes a few minutes to deploy and will show on your dashboard when it is ready. In subsequent steps we will be adding our actual functions but first we will configure several application settings – basically some of the variables that you need that are easier to pull in to the function rather than have to hard code wherever you need them. Once the blade for your new Function App opens you will see a section bottom right labelled Configured features – and we are headed to Application settings to set the settings for our application:
There are a bunch of settings already configured – we will be adding the following:
*** added tenantId 10/28/2017 – You can find it by going to the Admin Portal, then the Admin Center for Azure AD, then the Properties item under Manage – and the Directory ID is the GUID you are looking for. ***
Variable Value Purpose
aadtenant <yourtenant>.onmicrosoft.com The tenant we are working with
aad_username <yourname>@<yourtenant>.onmicrosoft.com The login we will be using
aad_password *************************** Password for above
clientId <your application id from step 3> Identify our app and get the permissions
messageCenterPlanId <the PlanId of the Plan from Step 1> Make sure we write to the right plan
tenantId <see above> Identifies your tenant for the api calls
In my case, as I am the only user of my Azure account, I’m ok with a clear text password being saved (and it is only a demo tenant) – there are ways to store and use an encrypted version – searching the web should find some solutions, including Azure Key Vault. I found this example: https://blog.tyang.org/2016/10/08/securing-passwords-in-azure-functions/. Once you’ve added all the entries there is a Save option at the top of the page. Other settings will get added as we create our functions.
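Inside the function code these Application settings surface as environment variables – a minimal sketch of pulling them in:

```powershell
# Application settings appear as environment variables inside a Function
$tenant   = $env:aadtenant
$username = $env:aad_username
$password = $env:aad_password
$clientId = $env:clientId
$planId   = $env:messageCenterPlanId
$tenantId = $env:tenantId
```

This is why the scripts don’t need the values hard coded – changing a setting in the portal changes what the function picks up on its next run.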
Step 5 – Our Azure Functions at last!
To create our first function – which will be the one that reads the messages based on a timer (hourly for starters) and writes to a storage queue we first hit the + sign next to Functions:
There is a new Wizard for premade functions – but for the PowerShell one I am creating I’ll go for the Custom function option in the lower part of the screen:
I choose PowerShell in the language dropdown (but take a look at the other options while you are there) and the middle one is the one for us – TimerTrigger – PowerShell. I’ll call it ReadMessagesOnTimer and set the schedule to hourly with the cron format of 0 0 * * * * and click Create. (Daily may be fast enough in production – although if you were using this to read service health data and not messages then hourly may be appropriate. See https://codehollow.com/2017/02/azure-functions-time-trigger-cron-cheat-sheet/ for a good cron reference for Azure Functions.)
The initial script just outputs a timestamp of when the function was executed. We have some other settings to add before we paste in the script.
If we go to the View files tab on the right there are a couple of files we need to upload. One is the products.json file we created earlier, and the second is a dll that we need for the Azure (adal) authentication. This is called Microsoft.IdentityModel.Clients.ActiveDirectory.dll and can be found in C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\Services by default – and if you don’t see it then you probably need the Azure SDK https://azure.microsoft.com/en-us/downloads/. Upload both of these files.
Next in the Integrate section under our function we will add a new output – for writing to our storage queue. After we click New output we can choose Azure Queue Storage from the panel and click Select.
For the Message parameter name and Queue name I’ll leave the default, and for the Storage account connection I’ll choose AzureWebJobsStorage from the dropdown – then click Save. And while I’m here I’ll also create a new function triggered by this output by clicking Go.
Again I will choose PowerShell from the Language dropdown – name my function WriteTaskToPlan and click Create.
My second function also needs to have the dll uploaded but doesn’t need the products file as I do all the selection and tagging in the first function and write that out to the queue.
Now for the fun stuff – pasting in the PowerShell code itself and running! The code is attached in ReadMessageOnTimer.txt and WriteTaskToPlan.txt. We can start with the first one – paste it in, and there shouldn’t be any need for edits unless you have deviated from my naming – one potential change is in the path to the uploaded documents, which includes the name of the function – so edit as necessary. You can run this just using the Run option – you will see an exception the first time, as the queue gets created if it doesn’t exist – and it will populate the queue with messages. If I navigate back to the Integrate section for my WriteTaskToPlan function (which we haven’t pasted in yet, unless you are ahead of me) and change the Queue name to the real one I am using, “message-center-to-planner-tasks”, we should quickly see that our queue gets drained – assuming you have a way to monitor the queue! This is where Azure Storage Explorer comes in handy – see the foot of the https://azure.microsoft.com/en-us/downloads/ page under Standalone tools.
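The way the first function hands messages to the queue is worth a quick sketch – in v1 PowerShell functions the output binding surfaces as a file path in $outputQueueItem, and the $task shape below is my illustration rather than the exact object the attached script writes:

```powershell
# Sketch: writing one message to the queue output binding
if (-not $outputQueueItem) {
    # fallback so the sketch also runs outside the Functions host
    $outputQueueItem = [System.IO.Path]::GetTempFileName()
}

$task = @{
    title      = "MC123456 - Planner: example message"   # example title only
    bucketId   = "<bucket id>"
    assigneeId = "<member id>"
}
$task | ConvertTo-Json | Out-File -Encoding UTF8 -FilePath $outputQueueItem
```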
In this screenshot I managed to see some of the messages before they were picked up by the 2nd function (which isn’t actually creating tasks yet).
The storage account is my brismitho365mcstore and I can see my message-center-to-planner-tasks queue. We can also monitor function activity via the Monitor section under each of the functions – and this shows the ‘empty’ function pulling messages from the queue:
If I now overwrite the current contents (2 lines) of my WriteTaskToPlan function with the contents of WriteTaskToPlan.txt – updating the paths to the files as necessary if the name isn’t the same as mine – then I can save and I should now have a working system. I can either wait until the top of the hour – or just run the ReadMessageOnTimer function manually to check that all is working. Just to show some of the debugging capabilities I ‘forgot’ to load the dll – so in my case I had a number of failures (it tries each queue message 3 times before giving up) and here you can see the really useful log info that shows that my dll wasn’t found (Thinking back I probably could have loaded it at the wwwroot level to serve both functions too):
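The receiving side works the same way in reverse: the queue message arrives as a file path in $triggerInput (the two-line wizard template reads it exactly like this), and from it the Graph task body is built. The property names on $msg are my illustration – see the attached WriteTaskToPlan.txt for the real ones:

```powershell
# Sketch: read the queued message and build a Planner task body
if (-not $triggerInput) {
    # fallback sample message so the sketch runs outside the Functions host
    $triggerInput = [System.IO.Path]::GetTempFileName()
    '{"title":"MC123456 - test","bucketId":"b1","assigneeId":"u1"}' | Set-Content $triggerInput
}

$in  = Get-Content $triggerInput
$msg = $in | ConvertFrom-Json

$taskBody = @{
    planId   = $env:messageCenterPlanId
    bucketId = $msg.bucketId
    title    = $msg.title
    assignments = @{
        ($msg.assigneeId) = @{
            "@odata.type" = "#microsoft.graph.plannerAssignment"
            orderHint     = " !"
        }
    }
} | ConvertTo-Json -Depth 4
# ...which is then POSTed to https://graph.microsoft.com/v1.0/planner/tasks
# with the bearer token from the ADAL call
```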
With my dll uploaded and another manual run of my function we are in business!
And to show what the tasks look like when I drill in – let’s take a look at a rich content message (actually a test sent just to my tenant with a Planner title – MC123579 – Planner: test format targeted post – we have a bug currently where the thumbnail for the image isn’t showing). But first, this is the original message in the Message Center – complete with a product link and a YouTube video:
Then in Planner we don’t have the richness so instead I trawled the text for Urls and added them as attachments – so the person who is assigned the task in Planner can still review the content (and the potential target audience here is people who can’t access the Message Center as they are not global admins remember). You can also see here the categories set from metadata in the message.
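The Url-trawling idea is simple enough to sketch – pull the http(s) links out of the message body text so they can be added as task attachments. The regex and the sample text are illustrative, not the exact ones in the attached script:

```powershell
# Sketch: extract urls from message text for use as task attachments
$messageText = 'Learn more at https://support.office.com/planner and watch https://www.youtube.com/watch?v=abc123.'
$urlPattern  = 'https?://[^\s"<>]+'

$urls = [regex]::Matches($messageText, $urlPattern) |
    ForEach-Object { $_.Value.TrimEnd('.', ',', ')') }   # drop trailing punctuation
$urls
```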
Step 6 – “Azure Functions are not something I want to play with right now”
That turned into a pretty long blog – even if it is mostly pictures – and I really wanted to step through the code too to explain what I was doing, as I’m sure even if you don’t want to do this precise thing there may be parts that are useful – but I’ll save that for another blog.
This last part takes the logic of the two functions and puts it into one PowerShell script that you could run from your desktop (assuming the right permissions, the Office 365 PowerShell connectivity – see https://technet.microsoft.com/en-us/library/dn975125.aspx – and also the Azure SDK stuff https://azure.microsoft.com/en-us/downloads/ – PowerShell is listed half way down in the command-line tools).
You will still need to do Steps 1 to 3 to create your Plan, get your bucket and resource info and create your products.json – and register an ApplicationId – and then the PowerShell is in the attachment – MCtoPlannerFull.txt. You will need to edit the various constants – all identified by <something that needs editing> type text. The main differences are references to local files – not going via a queue, and looping round the entire set of messages as it writes them out.
Hopefully the copy/paste of the text hasn’t broken anything – but it is always worth looking for the quotes or dashes – just in case they have been changed by one of the editors I’ve used!
This is just a sample – and there are many ways you could change things about – from pulling the other message types, such as service information, or writing out to other applications like Yammer or Teams – that is the beauty of the Graph APIs – once you get familiar with them they open up a whole world of applications. And maybe the Message Center will get the PowerApp/Flow treatment – and make this even easier!
- Graph API – https://developer.microsoft.com/en-us/graph/
- Office 365 Service Communications API reference (preview) – https://msdn.microsoft.com/en-us/office-365/office-365-service-communications-api-reference
- Azure Functions – https://docs.microsoft.com/en-us/azure/azure-functions/
The scripts are attached in a zip file – along with a sample products.json – and the scripts also contain this disclaimer:
Sample scripts are not supported under any Microsoft standard support program or service. The sample scripts are provided AS IS without warranty of any kind. Microsoft disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.