AI-generated image of Simon Doy giving a thumbs up, with elements related to Azure, Model Context Protocol, AI, Copilot Studio and containers displayed in the background.

How to: Build a Custom MCP Server with the .NET MCP SDK, Host It as an Azure Container and Connect It to Copilot Studio


This blog post is part of a series of blog posts on my experiences building Custom Model Context Protocol Servers.

You can find the first post in the blog post series; I would recommend starting there.

Unfortunately, this post got a little delayed by my having a heart attack at the end of August. Fortunately, I am very lucky. Did you know that if you have a heart attack outside of a hospital, the survival rate is around 8 in 100?

Fortunately, I was at home with my kids, and one of them was with me when I collapsed. They got help from my heroic neighbours, who did CPR and brought me back to life. This, plus the fact that I live in a city which is only 10 minutes away from a hospital, meant I was one of the fortunate ones who was only down for 5-6 minutes, and within 3 hours I had had heart surgery and was in ICU.

What I will say to you, dear reader, is that if you have what you think is heartburn, but it only comes on when you exercise, please go and get checked out by a doctor.

I am still on the road to recovery, but I am very happy to be here and say that I am making good progress, and it’s time to start blogging properly again.

This document describes the trials and tribulations of building an MCP Server for LinkedIn with the .NET MCP SDK Toolkit and then hosting it in Microsoft Azure using the Azure Container Instance Service. This MCP Server is then going to be integrated into a Copilot Studio-built Agent. I will show you how to do this; unfortunately, I cannot share the code for the LinkedIn integration.

So, let’s get started.

High-level Steps

The following high-level steps are required to deliver an MCP Server.

  • Build MCP Server
  • Test Locally
  • Configure Visual Studio to build the container
  • Set up Azure Container Registry
  • Set up Azure App Service
  • Publish and Deploy Container
  • Test and Debug with MCP Inspector
  • Set up the MCP Server custom connector
  • Connect MCP with Copilot Studio Agent

Build your MCP Server

So, the first step is to build your MCP Server. It does not really matter what your MCP Server does; using the details that I provided in my previous post, I built mine as follows:

  • Fired up Visual Studio 2022
  • Created a new Console Project called i365.LinkedInMcpServer
  • Added the following packages:
    • Microsoft.AspNetCore.Authentication.JwtBearer – Version="9.0.11"
    • Microsoft.Extensions.Hosting – Version="9.0.11"
    • Microsoft.Identity.Web – Version="4.1.1"
    • ModelContextProtocol – Version="0.5.0-preview.1"
    • ModelContextProtocol.AspNetCore – Version="0.5.0-preview.1"
  • Added a Program.cs file and configured it so that a WebApplicationBuilder was created and contained the configuration of the WebApplication.
  • You can see the code in the associated GitHub Repo.
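To give a sense of what that Program.cs looks like, here is a minimal sketch using the ModelContextProtocol.AspNetCore preview package, assuming a web project with implicit usings enabled. The method names come from the preview SDK and may change between preview versions, so treat this as illustrative rather than definitive:

```csharp
var builder = WebApplication.CreateBuilder(args);

// Register the MCP server, use the Streamable HTTP transport and
// discover any [McpServerToolType] classes in this assembly.
builder.Services
    .AddMcpServer()
    .WithHttpTransport()
    .WithToolsFromAssembly();

var app = builder.Build();

// Expose the MCP endpoint; MCP Inspector and Copilot Studio connect to /mcp.
app.MapMcp("/mcp");

app.Run();
```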

I edited the project file and added several configuration items to set up container support for the project. This allows me to create a container which will be used to host the MCP Server. I wanted to be able to test this locally before pushing up to Azure, which is made easier by being able to run the process locally.

 <PropertyGroup>
    <EnableSdkContainerSupport>true</EnableSdkContainerSupport>
    <ContainerRepository>ithink365/ithinkexamplemcp</ContainerRepository>
    <ContainerRegistry>[container-registry-prefix].azurecr.io</ContainerRegistry>
    <ContainerFamily>alpine</ContainerFamily>
    <ContainerRuntimeIdentifiers>linux-x64;linux-arm64</ContainerRuntimeIdentifiers>
    <ContainerBaseImage>mcr.microsoft.com/dotnet/sdk:9.0</ContainerBaseImage>
    <UserSecretsId>[UserSecretId]</UserSecretsId>
  </PropertyGroup>

Next, let’s look at the tools that are provided by the MCP Server.

The MCP Server tool that I am showing here is a simple tool which will echo what has been sent to it. My real example was building a set of tools for LinkedIn for my company, but the approach is the same.

Create a class which will host your MCP Server and decorate it with the following attributes.

  • McpServerToolType attribute is applied to the class.
  • McpServerTool attribute is applied to each function that exposes a tool to the MCP Server.
  • Description attributes are important and allow you to describe to the calling MCP Client what the function does and also what the role is for each parameter that is passed into the function.
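Putting those attributes together, a simple echo tool like the one described above might look like the sketch below. The class and method names are illustrative, and the attribute-based model comes from the preview ModelContextProtocol package, so details may shift between versions:

```csharp
using System.ComponentModel;
using ModelContextProtocol.Server;

// Marks this class as a container of MCP tools.
[McpServerToolType]
public class EchoTool
{
    // The Description attributes tell the calling MCP Client what the
    // tool does and what each parameter is for.
    [McpServerTool]
    [Description("Echoes the supplied message back to the MCP client.")]
    public static string Echo(
        [Description("The message to echo back.")] string message)
    {
        return $"Echo: {message}";
    }
}
```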

Once you have set up your MCP Server tool and configured Program.cs, you are ready to go.

The first time I fired up the MCP Server in Visual Studio, I hit a couple of issues, which I will cover now so that you don’t have the same problem.

Firstly, I needed to change the port that the MCP Server was listening on so that it could receive requests.

You can read about how to resolve this in the Port Fun section below.

Debugging and Testing your MCP Server

Currently, the best way to test the MCP Server is by using MCP Inspector. However, I am having some challenges using MCP Inspector when testing MCP Servers that are protected by Microsoft Entra ID. This is due to an issue with how the MCP Protocol is described vs how the OAuth Protocol has been described. The MCP Protocol is stricter than the underlying OAuth protocol, and this is causing an issue because of how Entra ID works with resource parameters.

I have found that using the following to run MCP Inspector from a PowerShell script works well.

  • npx @modelcontextprotocol/inspector dotnet run

Once you have your MCP Server running in your Visual Studio debugger and your MCP Inspector running, you can test it by connecting your MCP Inspector to the /mcp endpoint provided by your MCP Server when running locally.

To connect MCP Inspector to your MCP Server:

  • Fill in the URL, ensuring you have added the right port and /mcp on the end.
  • Click Connect
  • Click on Tools in the middle and click List Tools.
  • Your tools should load up after a few seconds, and now you can test each tool.

Deploying to the Cloud

Once you have tested your MCP Server, then you are ready to deploy it to the Cloud. You will need a few bits of Azure Infrastructure including the following:

  • Resource Group
  • Azure Container Registry
  • Azure App Service
  • Storage Account

Setting up the Azure Infrastructure

Create a new resource group in your desired location, then add the Azure Container Registry, Azure App Service and Storage Account.

Make note of the Azure Container Registry connection details (login server and credentials), as you will need them to configure the publishing of your container from within Visual Studio.

Port Fun

It turns out that MCP Servers running in Azure containers are not happy with the default ports, as Azure expects the container to listen on port 8080.

To resolve this, I used the following environment variables to change the ports the container listens on. This fixed my problems.

To set up the environment variables, do the following. This can all be done within Visual Studio:

  • Open up your MCP Project file
  • Create the following environment variables in the project file
<ItemGroup>
   <ContainerEnvironmentVariable Include="ASPNETCORE_URLS" Value="http://*:8080;http://*:5000" />
   <ContainerEnvironmentVariable Include="ASPNETCORE_HTTP_PORTS" Value="8080;5000" />
</ItemGroup>

Fun and Games with SSL and Azure Container Instances

The original intention was to use an Azure Container Instance and access it directly. I had to come up with another plan because the process of setting up SSL within an Azure Container Instance just seemed too painful.

Change of Plan

The issue with setting up SSL and Azure Container Instances gave me some food for thought.

The challenge is how to ensure that all communication is through an encrypted channel.

When we run our MCP Server as an Azure Container Instance, SSL termination has to happen within the container. This looked really painful from a couple of angles; in particular, having to keep updating the SSL certificate when it expired was going to add some admin overhead. So, I looked at different options.

We could implement containers behind Azure API Management as an option.

Additionally, we could use Azure Functions, which was going to be my preferred option. However, at the time of setting this up, I hit an issue where, because we want to integrate MCP with Copilot Studio, we had a mismatch with protocols.

Azure Functions, at the time of writing this blog post, only supported SSE, and Copilot Studio has just deprecated SSE support.

I can see that updates have been made to the MCP Azure Functions SDK, and they now support Streamable HTTP. I am going to take a look at this next.

My other option was to use an Azure App Service to host the container. It turns out that this works well and solves my problem with SSL. At the same time, I can still use the container that I had already built and tested to host the MCP Server. Result!

So, what was involved?

Deploying MCP Server

It is possible to deploy the MCP Server from Visual Studio using the publish feature.

To deploy the container into the Azure Container Registry, do the following:

  • Ensure that you have set up the Visual Studio Project settings as mentioned above.
  • Right-click your Project that hosts your MCP Server
  • Choose Publish
  • Click New Profile
  • Choose Azure
  • Click Next
  • Choose Azure Container Registry
  • Click Next
  • Select your Azure Subscription.
  • Choose the Azure Container Registry that you have created.
  • Click Next
  • Choose .NET SDK (No Dockerfile required)
  • Click Finish

Now you are ready to publish your Container to the Azure Container Registry.

  • Ensure your Azure Container Registry is selected
  • Click Publish

Wait for the deployment to be completed.

Setting up the MCP Server as an App Service

I created an Azure App Service within the same resource group and Azure subscription that the Azure Container Registry had been provisioned within.

To configure the Azure App Service:

  • Click on Deployment -> Deployment Centre
  • Add a Custom Container and configure the container to be loaded from the Azure Container Registry by choosing the Azure Subscription and Image.
  • Set the Port to 8080
  • Click Apply.

This will deploy the Azure Container to the Azure App Service.

Updating your MCP Server

Of course, one of the challenges is updating the MCP Server: because we are deploying and updating a container, we need to notify the host of the MCP Server, the App Service, that an update has taken place. I really should look at how to automate this. Maybe I will once I have published this.

The steps are to go to the Deployment Centre on the App Service and click the Sync button. Acknowledge the warning that things will be updated and let it take place.

In my experience, this takes a bit longer than you think. After 5 minutes or so, your MCP server will be up to date. I have started to tweak the description of the MCP server so that I can see when it’s updated.

If you use the MCP Inspector, you will start to experience delays in the MCP connecting, and that is the sign that it has been updated.

Updating your MCP Server with Azure DevOps Pipelines

I have found that updating through publishing via Visual Studio is a little unreliable at times. I believe that this is down to authentication issues when authenticating with the Azure Container Registry.

To resolve this, I have built a series of Azure DevOps Pipelines, which are used to build and release the MCP Server to the Azure Container Registry and then deploy it to the Azure App Service.

I will document this in a separate blog article.

Linking the MCP Server to Copilot Studio

The process of hooking up and making the MCP server available to Copilot Studio is much easier than I was expecting.

First, we need to create a custom connector which will be targeted at the MCP server.

  • Browse to https://make.powerapps.com
  • Click on Custom Connectors
    • If you cannot see Custom Connectors, click More and then Discover All
The connector is defined by an OpenAPI (Swagger) file. The important line is the x-ms-agentic-protocol: mcp-streamable-1.0 extension on the operation, which tells Copilot Studio that this is a streamable MCP endpoint:

swagger: '2.0'
info:
  title: Contoso
  description: MCP Test Specification, YAML for streamable MCP support in Copilot Studio
  version: 1.0.0
host: contoso.com
basePath: /
schemes:
  - https
paths:
  /mcp:
    post:
      summary: Contoso Lead Management Server
      x-ms-agentic-protocol: mcp-streamable-1.0
      operationId: InvokeMCP
      responses:
        '200':
          description: Success

  • Click on New Custom Connector
  • Choose Import an OpenAPI File
  • Choose your mcp-server-openapi-schema.json file
  • Click Continue
  • Ensure your Host URL is correct and the same as your Azure App Service.
  • Ensure Base URL is /
  • Give the MCP Server a suitable name and description.
  • Click Security

For this MCP Server there is no authentication, but we really should have authentication, preferably Entra ID OAuth 2.0-style authentication.

I will be putting together a guide to enable an MCP Server with authentication in a future post.

  • Click Definition
  • Update the Summary Name
  • Update the Connector Name
  • Check the URL for the Invoke MCP command

Once you are happy:

  • Click Create Connector

Once we have created the custom connector, we can consume it through our Copilot Studio Agent. The process of publishing the connector will create a new MCP tool within the Power Platform environment, and then this can be added to your agent.

  • Browse to your Agent
  • Click Tools
  • Click “Add a Tool”
  • Choose your MCP Server
  • You will be asked to create a connection to the MCP Server.
  • Click Add and Configure
  • After a short period of time you should see a list of tools appear.
  • Now enable/disable the tools that the MCP Server provides.

A recent update means you can choose which tools are available to your Agent, so enable the ones you want and disable the ones you don’t.

Now you can publish and test your agent.

Try it out.

Of course, you will need to make sure your agent is enabled to use the MCP tool and has orchestration enabled.

Conclusion

In this blog post, I explained how to set up an MCP server to be hosted in a secure and encrypted environment using Azure App Service. I also explained how to deploy and update the MCP Server, and finally test it and connect it up to Copilot Studio.

I’d love to hear how your MCP Server experiences are going and if you found this useful.

My plans for the next posts in this series are to build and deploy an MCP Server with Azure Functions, explain how we can deliver Azure DevOps pipelines to build and release our MCP Server container, discuss authentication so that you can secure your MCP Servers, and finally look into the recent update to Declarative Agents, which now support the Model Context Protocol!

Check out the Microsoft Dev Blogs Post.

My Adventures in building and understanding MCP with Microsoft 365 Copilot


So, I have been following the Model Context Protocol (MCP) world for a while now. I first heard about MCP just as we were going out to MVP Summit in March 2025.

Already, the Microsoft Copilot Extensibility team were on the case with people like Fabian Williams experimenting with them. I have been following this space, reading articles and finally, over the summer, I have had some time to roll up my sleeves and look at how I would build an MCP Server. Primarily with the aim of making it available to Microsoft 365 Copilot via Microsoft Copilot Studio and the Microsoft 365 Copilot extensibility world.

This article will be part of a blog series that describes the trials and tribulations of building an MCP Server.

The MCP Server I wanted to build was for a small demo that I wanted to create. The aim was to bring together multi-agents and MCP. The goal was to create a solution that allows a marketing person to create a Marketing Campaign which describes a story for an ideal client, and then allows the creation of social media content on LinkedIn.

The idea was that we would have four Agents

  • Marketing Campaign Agent
  • Social Media Content Creator Agent
  • LinkedIn Posting Agent
  • Marketing Content Quality Assurance Agent

The plan was to make these agents available through Microsoft 365 Copilot and build them using Microsoft Copilot Studio. Multi-Agent support was launched at Microsoft Build 2025 in May and was made available to us in June 2025.

My first step was to sit down and start doing some investigation. I needed to answer questions such as:

  • How do we host MCP Servers?
  • How do we secure them?
  • How do we build them, deploy them, debug them?

Research

Like all good developers / solution architects / vibe coders… I needed to get stuck in, and we all know we should research things first. Well, I ignored that for about an hour, and then I thought I had better understand how to build things before going any further.

So, I did a bit of research and found a great article by Oleksii Nikiforov on building MCP Servers hosted within Aspire; here is the link to his posts.

From these posts I learnt a bit more about Aspire (which I had heard a lot about but never tried) and about MCP Inspector (which I had not heard of but quickly got to grips with).

The tutorials that Oleksii has put together are great and I quickly had an MCP Server running through Aspire which I could connect to with MCP Inspector.

Microsoft Product Groups are busy writing a number of different frameworks to build MCP Servers and the one that has a lot of momentum behind it is the MCP .NET SDK, https://github.com/modelcontextprotocol/csharp-sdk

The other framework that caught my attention is the Microsoft Azure Functions MCP Server framework, for which a sample can be found here: https://learn.microsoft.com/en-us/samples/azure-samples/remote-mcp-functions-dotnet/remote-mcp-functions-dotnet/

I must admit I really like the idea of MCP Servers with Azure Functions. There are some great videos of how to build MCP Servers with Azure Functions and we will delve into them a little bit later.

However, from the research that I did, it seemed that most people were building MCP Servers using containers, so I thought I would start there with the .NET SDK, using Oleksii’s approach.

There was quite a bit to learn, which I will talk about next; then, in the next blog post, I’ll delve into building out the MCP Server with the different approaches.

The final bit of research that I did was to read the MCP specification. I will be honest: I read it and got a bit more of an idea, but those RFC-style documents are hard work.

However, the MCP website is much nicer and easy to understand, so here is a link to the MCP Specification, https://modelcontextprotocol.io/specification/2025-03-26/basic

Microsoft 365 Copilot was quite good at giving me an overview of the protocol.

Overview of the MCP Protocol

MCP is built on JSON-RPC, using UTF-8 encoded messages for communication between clients and servers. It supports multiple transport mechanisms, allowing flexibility depending on deployment needs.
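For example, the first message an MCP client sends over any transport is an initialize request, which is an ordinary JSON-RPC 2.0 call (the client name and version below are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2025-06-18",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "1.0.0" }
  }
}
```

The server replies with its own protocol version and capabilities, after which the client can start discovering tools and resources.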

To understand the relationship between the different components have a read of the lifecycle process for the Model Context Protocol, https://modelcontextprotocol.io/specification/2025-06-18/basic/lifecycle.

MCP and Authentication

MCP and authentication have been evolving, and an area which was missing at the initial launch of MCP is now defined. I suspect that this will continue to change and evolve with feedback.

I found the following post by Den really useful for understanding auth and the direction it is heading in. Of course, these posts are going to be great: Den is one of the core maintainers of MCP and has some great articles and insights into the design decisions.

OAuth In The MCP C# SDK: Simple, Secure, Standard · Den Delimarsky

https://den.dev/blog/mcp-csharp-sdk-authorization/

MCP Inspector

First, let’s talk about some tools, and we should start with the MCP Inspector (https://github.com/modelcontextprotocol/inspector). This tool seems to be the go-to tool when testing out MCP Servers. I am sure there are more out there, and I will be doing some research into those tools as well.


The MCP Inspector allows you to connect to your MCP Server, which is great, and it supports authentication via OAuth2 or a Bearer Token.

Additionally, it supports the main MCP Server transports, which we will talk about shortly.

The solution that Oleksii has put together embeds a version of MCP Inspector and makes it easy to use. However, I found that this was an older version, so I got into the habit of using the following command to run the latest version of MCP Inspector from the command line.

npx @modelcontextprotocol/inspector dotnet run

I’ll be honest, I do not remember using npx (Node Package Execute) before, but it has been around for a while. It is an amazing tool which ships as part of the npm CLI (Node Package Manager) and enables Node.js packages to be executed directly from the npm registry.

The other advantage of using npx to run MCP Inspector is that you can see what the MCP Inspector is up to more easily as it outputs logs to the command line.

MCP Transport Types

One of the first things that I needed to get my head around was the different MCP Transport types. These different communication protocols are used to enable MCP in different scenarios.

Let’s talk about these next.

STDIO Transport

This is the most lightweight and direct transport method.

  • How it works: The client launches the MCP server as a subprocess.
  • Communication:
    • Messages are sent via stdin and received via stdout.
    • Only valid JSON-RPC messages are allowed (no embedded newlines).
    • Logging (if any) is done via stderr.
  • Use case: Ideal for local development or tightly coupled systems where simplicity and low overhead are key

STDIO Transport allows a local MCP Client to instantiate and run a local MCP Server and talk to it over standard input and output. This is great for local MCP Clients like Visual Studio Code with GitHub Copilot, Claude, etc.
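In the .NET SDK, a STDIO-hosted server is just a console app wired up through the generic host. As a hedged sketch against the preview package (the method names may change between preview versions):

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var builder = Host.CreateApplicationBuilder(args);

// Serve MCP over stdin/stdout and discover attributed tool classes.
builder.Services
    .AddMcpServer()
    .WithStdioServerTransport()
    .WithToolsFromAssembly();

await builder.Build().RunAsync();
```

Because the client launches this process itself, there is no port or URL to configure; any logging should go to stderr so it does not corrupt the JSON-RPC stream.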


SSE (Server-Sent Events)

This was the original streaming mechanism used in earlier versions of MCP.

  • How it worked:
    • Clients would initiate an HTTP connection and receive a stream of server messages via SSE.
    • It allowed for real-time updates without polling.
  • Limitations:
    • SSE is unidirectional (server-to-client only).
    • It lacked flexibility for more complex bidirectional communication.
  • Status: Deprecated in favour of Streamable HTTP as of protocol version 2025-03-26

This is currently the transport of choice for MCP Servers built on Azure Functions, which caused me problems and made me rethink that approach. I know that the Azure Functions team will be working on resolving this issue.


Streamable HTTP (Current Standard)

This is the modern, flexible transport replacing SSE.

  • How it works:
    • The server runs independently and handles multiple clients.
    • Clients send JSON-RPC messages via HTTP POST requests.
    • The server can respond using either standard HTTP responses or SSE for streaming.
  • Security Considerations:
    • Servers must validate the Origin header to prevent DNS rebinding attacks.
    • Local servers should bind to localhost only.
    • Authentication is strongly recommended.
  • Use case: Best for scalable, production-grade deployments where streaming and multi-client support are needed

This is the current flavour of the week and if you are building MCP Servers that are going to run over a network then this is the approach you should be taking.

MCP Client

We are nearly at the end of this blog post, and I have not really talked about the MCP architecture; to be honest, there are some great resources out there that do this. However, we need to talk about the main parts of an MCP ecosystem. The MCP Client is the consumer of MCP Servers. The MCP Inspector is an example of an MCP Client: it can connect to an MCP Server and discover its resources, its tools and how to authenticate.

I can see that more and more tools will have MCP Clients built in to allow them to consume MCP Servers and use their capabilities.

For more information on the MCP Client, read https://modelcontextprotocol.io/specification/2025-06-18/client/roots
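The .NET SDK includes the client side too. As a rough sketch against the preview API (the transport and factory types are from the preview package, and the endpoint URL is a placeholder), connecting to a remote server and listing its tools looks something like:

```csharp
using ModelContextProtocol.Client;

// Connect to a remote MCP server; in the C# SDK, SseClientTransport
// also handles Streamable HTTP. The endpoint URL is a placeholder.
var transport = new SseClientTransport(new SseClientTransportOptions
{
    Endpoint = new Uri("https://contoso.azurewebsites.net/mcp")
});

var client = await McpClientFactory.CreateAsync(transport);

// Discover the tools the server advertises.
foreach (var tool in await client.ListToolsAsync())
{
    Console.WriteLine($"{tool.Name}: {tool.Description}");
}
```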

MCP Server

The MCP Server is the part of the MCP architecture which exposes resources, tools and prompts via the MCP primitives. MCP Servers operate as independent components and should be built with a focused set of capabilities.

I am really fascinated to see how the protocol evolves to handle the challenges with different authentication approaches and types, but all of this is advertised and described by the MCP Servers themselves.

Fundamentally, though, MCP Clients learn what is available to them by discovering the resources and tools when they interrogate the MCP Server.

Conclusion

In this blog post I set the scene for what I have been up to with my adventures into the Model Context Protocol space. I have tried to document my journey and resources that I have discovered. I talk about some of the components and tools and link to the resources that I hope you find useful.

In the next blog post I am going to talk about my experiences with building MCP Servers with the MCP .NET SDK and delve into different hosting models and the challenges with them as you look to build secure and encrypted MCP Servers.

Please connect with me on LinkedIn and Bluesky; I would love to hear how you are getting on with building MCP resources.