Setting up for Power Platform Dev Ops Pre-requisites


This blog post is part of a series of blog posts based on my experiences building a set of Dev Ops processes for deploying Power Platform solutions using Azure Dev Ops.

If you have not read the first post in the series, I recommend starting there.

This post will discuss the pre-requisites required to set up Power Platform Dev Ops, and then show you how to set up each of them.

So, let’s get started!

Firstly, the process of deploying Power Platform solutions into an environment relies on a few things:

  • An identity to authenticate with the Power Platform environment
  • Access to the Power Platform environment
  • A connection to the Power Platform environment for Azure Dev Ops

The identity that we are going to use is an Azure Active Directory Application. If you do not know what one of these is, then read this. Briefly, it is like a service account and represents an application within Azure which can be given permission to access resources in Microsoft 365 and Microsoft Azure.

I will refer to this Azure AD Application as the Power Platform Deployment Engine. This is the terminology that we use at iThink 365.

Secondly, this identity needs to be able to access the Power Platform environment, so we will discuss the steps to do that.

Finally, our Azure Dev Ops environment that is going to be running the build and release processes needs to be able to connect to the Power Platform environment using the Power Platform Deployment Engine. In Azure Dev Ops this is achieved using a service connection.

Setting up the Power Platform Deployment Engine (Azure Active Directory Application)

To set up the Azure AD Application, do the following: 

  • Choose the Azure Active Directory resource 
  • Click on Application Registrations 
  • Add a new Application Registration, click New Registration 
  • Fill in the details as shown in the screenshot below 
  • Name: Power Platform Deployment Engine 
  • Supported account types: Single Tenant
  • The redirect URL is optional. At iThink 365 we set it up as our website.
  • Click Register when ready 
Steps to setup the Azure AD Application for the Deployment Engine

Next, set up the permissions that the app requires: 

  • Browse back to the application 
  • Click API Permissions 
Setting up API Permissions

  • From the API permissions page 
  • Click Add permission 
  • Choose from the right-hand task pane “Dynamics CRM” 
  • Choose delegated permissions 
  • Choose user_impersonation 
  • Click Add permission 
  • Add another permission, this time for Microsoft Graph 
  • Find the User group 
  • Choose User.Read 
  • Click Add permission 

Once the permissions are set up, they need to be granted for the tenant using admin consent. 

To achieve this you need an account with the Global Admin role assigned. 

  • From the API permissions screen
  • Click Grant admin consent for [tenant name] 
  • Sign-in and consent to the application. 

The last step is to create a client secret. 

  • Click Certificates and Secrets 
  • Click New Client secret 
  • Fill in a description and set the lifetime to 2 years 
  • Make note of the client secret that has been created as you will need it later.

The Azure AD Application configuration is complete. 

Setting up access to the Power Platform environment

The next step is to give the Power Platform Deployment Engine access to the Power Platform environment.

The process to do this has the following steps:

  • Add an application user to the Power Platform environment.
  • Set permissions for the environment for the application user.

Let’s get started.

To do this do the following: 

  • Click on the Environment name
  • Click Settings
  • Click Users + Permissions to expand
  • Click Application Users
Accessing the application users

From the application user screen click New app user.

  • Click Add an app
  • Search for the Power Platform Deployment Engine app
  • Select the app and click Add App
Add an application user
Search, find the deployment engine and add the application
  • Click security roles
  • Choose System Customizer
  • Click Save
Select the security roles
  • Click Create

The deployment engine now has access to the Power Platform environment. We have given the deployment engine the minimum access that we can; however, there are times when it needs more permission. The case I have seen is when security roles are deployed in solutions. If that is the case, then give the deployment engine the System Administrator security role.

Setting up the deployment engine will need to be repeated not only for each Power Platform environment that we are going to deploy to but also for the environment that we are developing our solutions in.

Therefore, repeat the setup process for each of the Power Platform environments.

With all the Power Platform environments setup, the final step is to connect our Azure Dev Ops environment to the Power Platform environment using the deployment engine.

Connect Azure Dev Ops to the Power Platform Environment

The last step is to connect Azure Dev Ops to the Power Platform environments using our Deployment Engine.

We will assume that you have an Azure Dev Ops Project Collection set up already. You will also need to be an Azure Dev Ops Project Administrator for the project.

  • Browse to your Azure Dev Ops environment.
  • Browse to your project.
  • Click Project Settings.
  • Choose Service connections 
  • Click new service connection 
  • Choose Power Platform 
  • Click Next 
  • Fill in the server URL, which is the Dynamics URL of the Power Platform environment you are connecting to. 
  • See the Getting the URL to the Power Platform environment section below 
  • Fill in your tenant id (found by going to Azure AD -> Properties) as the directory id 
  • Fill in the application id for the Power Platform Deployment Engine 
  • Fill in the client secret for the deployment engine which you made note of before. 
  • Fill in the name of the service connection.
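It can help to gather the values above in one place before you start filling in the form, especially since you will repeat this per environment. Here is a minimal sketch; every value below is a made-up placeholder, not a real id or secret.

```python
# Illustrative only: the values the Power Platform service connection form asks for.
# All values are placeholders; substitute your own environment details.
service_connection = {
    "server_url": "https://contoso-test.crm11.dynamics.com",   # the Dynamics URL of the environment
    "tenant_id": "00000000-0000-0000-0000-000000000000",       # Azure AD -> Properties -> Tenant ID
    "application_id": "11111111-1111-1111-1111-111111111111",  # the Power Platform Deployment Engine app id
    "client_secret": "<the secret you noted down earlier>",
    "name": "Power Platform Test Environment (https://contoso-test.crm11.dynamics.com)",
}

# Quick sanity check that nothing was left blank before creating the connection.
missing = [key for key, value in service_connection.items() if not value]
print(missing)
```

Keeping a record like this (minus the secret, which belongs in a vault) makes it much quicker to repeat the setup for the next environment.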

We use the following naming convention for our service connections so that it is easy to see the different service connections.

  • Use Power Platform [Release Stage] Environment ([Environment URL])
    • e.g. Power Platform Development Environment (
    • e.g. Power Platform Test Environment (
    • e.g. Power Platform Production Environment (
  • Click Save 

  • Click Save to complete the configuration of the service connection

You will now see the service connection for the Power Platform environment you have just created.

Repeat the service connection setup for each Power Platform environment that needs to be deployed to.

Getting the URL to the Power Platform environment 

To get the URL for your Power Platform environment do the following:

  • Browse to the Power Platform admin centre
  • Click Environments
  • Click on the environment name
  • The Environment URL is shown in the Details section

The steps that we have been through have created all the pre-requisites for deploying Power Platform solutions using Azure Dev Ops.

In the next article, we will go through the process of setting up the build and release pipelines.

Connecting SharePoint Online to On-Premise Databases with SharePoint Framework (SPFx)


Recently we seem to be getting involved in projects after they have gone awry. One of our partners reached out asking for help with an issue they had whilst migrating a SharePoint 2010 environment to SharePoint Online.

The problem was that the customer made extensive use of Dataview Webparts within their SharePoint 2010 environment. These web parts displayed important information used by the business. The Dataview Webparts connected directly to a SQL Server hosted within the customer's network.

Of course when the web parts were migrated into the cloud, they did not work as they were unable to connect to the SQL Server. This ended up stopping the migration until a solution was found.

The customer did not want to move the data into the cloud for several reasons, the most important being the number of other systems dependent on it.

Several options were considered, which will be discussed in the next section.

The approach

So several options were looked at, and as moving the database was quickly ruled out, we came up with these two:

  • Build a solution with PowerApps and use the On-Premise data gateway.
  • Build a solution with the SharePoint Framework, using a REST API and Azure Hybrid Connections

Whilst the PowerApps solution would take less time, the licensing cost of the PowerApps solution ended up ruling it out due to its total cost of ownership (TCO).

So, the SPFX solution was chosen. The architecture was to use SPFX webparts which connected to a REST API hosted in Azure App Services. The clever part was using Azure App Service Hybrid Connections which allowed us to connect from Azure back into the customer network without the need to reconfigure complex firewalls.

To help visualise the solution, let’s take a look at the architecture.

We ended up having two Azure Hybrid Connection Services running. One for the active live environment and another for the disaster recovery environment.

The data being accessed was sensitive, so the REST API had to be secure. It was configured so that it was backed by Azure AD using an Azure AD Application, implemented with OAuth authentication using JWT Bearer tokens. The SPFx web parts connect to the REST API and authenticate using OpenID Connect to ensure that only authenticated and authorised users can access the API. Further protection was provided by putting the API behind Azure API Management.
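To make the JWT bearer part concrete, a token is just three base64url segments (header, payload/claims, signature). The sketch below decodes the claims of a fabricated token for inspection only; a real API must validate the signature against the Azure AD signing keys and check the audience and issuer claims, which this deliberately does not do. The claim values are made up.

```python
import base64
import json

def jwt_claims(token):
    """Decode the payload segment of a JWT bearer token WITHOUT verification.

    For illustration only: a real API must validate the signature against
    the Azure AD signing keys and check the aud/iss claims before trusting it.
    """
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))

# Build a made-up token: header, claims, and a fake signature segment.
header = base64.urlsafe_b64encode(b'{"alg":"RS256","typ":"JWT"}').rstrip(b"=").decode()
claims = base64.urlsafe_b64encode(
    b'{"aud":"api://hronline","scp":"user_impersonation"}'
).rstrip(b"=").decode()
sample_token = f"{header}.{claims}.fake-signature"

print(jwt_claims(sample_token)["scp"])
```

Inspecting the `aud` and `scp` claims like this is useful when debugging why a call to the API is being rejected.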


For the SharePoint Framework web parts to be able to authenticate with the REST API there are a couple of steps that need to be performed:

  • Configure the SharePoint Framework solution to request permission to access the REST API
  • Authorise the request made by SharePoint Framework to access the REST API.

To configure the SharePoint Framework solution take a look at this Microsoft post which provides a good guide (see section Configure the API Permission Requests).

The second part is performed by going into the SharePoint Admin centre and approving the request. The point to make is that the user accepting the request needs to be a SharePoint Administrator and also be able to grant admin consent to Azure AD Applications. A Global Admin has both, so we worked with the IT team to ensure a privileged user did the authorisation. Be mindful of this when deploying to the customer, as it will take some discussion and time to organise!

The screen to authorise SharePoint Framework solutions.

Another point to make here is that the name of the Azure AD Application configured in the SharePoint Framework solution needs to map to the name of the Azure AD Application configured in Azure AD. This is configured, as mentioned above, in the SharePoint Framework solution. When I first looked at this, I mistakenly set the resource to the resource id of the Azure AD Application rather than the name of the application.

Hopefully mentioning this will mean you do not waste your time getting this right.

"webApiPermissionRequests": [
        "resource": "HROnline Api",
        "scope": "user_impersonation"

Azure Hybrid Connections

The Azure Hybrid Connection is setup in two places.

  • Azure App Service hosted in the cloud
  • Hybrid Connection Manager – a service running on a Windows server within the network.

The hybrid connection service establishes a connection to the Azure App Service through Azure Relay which is built on top of Service Bus.

Diagram of Hybrid Connection high-level flow

To set up the Hybrid Connection in Azure App Service, the app must be running on the Basic tier or above.

There are some limitations to the types of connection that the technology supports. The transport mechanism needs to be TCP based; UDP is not supported. For this solution a .NET SQL client was used, which is supported and works really well.

For information on setting up the Azure Hybrid Connection see the following Microsoft article.


One of the areas that we wanted to ensure was the performance of the application, so we put together a POC to prove the approach and check performance. The performance has been very good and, provided that the REST API is developed with some thought, it performed better than expected.

There was plenty of thought that went into the API. A few of the optimisations we made were:

  • Making sure that we had support for paging and limiting the number of records retrieved at one time.
  • Using Dapper and performing filtering at the SQL layer rather than pulling the data down and filtering in the API.
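The real implementation did this with Dapper from a .NET REST API, but the idea of pushing paging and filtering down to SQL Server is language-neutral. The sketch below (Python, with invented table and column names) builds a parameterised statement using SQL Server's OFFSET/FETCH so the API never pulls the whole table down.

```python
def paged_query(table, filters, page, page_size, order_by="Id"):
    """Build a parameterised SQL Server query that filters and pages at the
    database layer rather than in the API.

    Table and column names are illustrative; the real solution used Dapper
    from a .NET API, but the SQL shape is the same.
    """
    where = ""
    params = {}
    if filters:
        clauses = []
        for column, value in filters.items():
            clauses.append(f"{column} = @{column}")  # parameterised, never inlined
            params[column] = value
        where = " WHERE " + " AND ".join(clauses)
    params["Offset"] = (page - 1) * page_size
    params["PageSize"] = page_size
    sql = (
        f"SELECT * FROM {table}{where} ORDER BY {order_by} "
        "OFFSET @Offset ROWS FETCH NEXT @PageSize ROWS ONLY"
    )
    return sql, params

sql, params = paged_query("Employees", {"Department": "HR"}, page=2, page_size=50)
print(sql)
```

Because the filter and the page window both execute on SQL Server, only one page of matching rows ever crosses the hybrid connection, which is a large part of why the performance held up so well.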


This solution enables SharePoint Online solutions to access data hosted On-Premises and it does work really well. To be honest we were surprised how well the solution performed.

Most importantly, the partner and customer were really happy with the end result too.

I hope that people find this post useful. If there is an aspect that you would like more information on, leave a comment and I'll see what I can do.