Copilot Studio Governance

Published by Valentin Mazhar


We saw a rebranding of Power Virtual Agents into Copilot Studio and an infusion of Gen AI into the product. Both tend to produce mixed feelings within Power CoEs and Admin teams. On one hand, this is exciting and opens many possibilities. On the other hand, companies need to catch up with Copilot Studio Governance, and more widely with AI Governance. The platform evolves quickly, and the governance-related capabilities are spread across different pages of the official documentation, which are not always easy to find. I will try to centralize most of it in this post!

Image of the move from PVA to Copilot Studio

Main Capabilities and Scope

Copilot Studio is rich in functionalities. Prior to discussing the tools available to govern it, let’s briefly surface the main capabilities of this product. A lot more information is available on the official documentation. The Copilot Studio Implementation Guide also provides great and comprehensive guidance.

Core Capabilities

At its core, Copilot Studio is a low-code chatbot creation platform. Whether we call these “chatbots”, “virtual agents”, or more recently “standalone copilots”, they describe an interface allowing a user to chat with a robot. Let’s cover the core Copilot Studio capabilities, in a nutshell:

  • Topics: the Maker can design the conversational flow with “Topics”. Topics correspond to the intent of the user, detected with the help of trigger phrases. Topics comprise different “nodes” which define the path of the conversation. For example, we can create a Topic about “Holidays”. Users would trigger this Topic automatically when they ask about their holiday allowance or a specific type of leave. The nodes would then take them through providing additional details before responding with the expected answers or actions.
  • Nodes: there are different types of Nodes available. They include Send a message, Ask a question, HTTP Request, Call a cloud Flow, and more.
  • Integration with Power Automate: the integration with the HTTP Request and Cloud Flows nodes offers a lot of possibilities. Indeed, the bots can perform actions leveraging the thousands of connectors available via Power Automate.
  • Channels: Users can interact with the chatbots from diverse channels such as Microsoft Teams, Facebook, and more.
  • Authentication: Different authentication methods are possible, from none at all to Entra ID, Teams Only and custom OAuth2 identity providers.

Generative AI Features

The core capabilities have been around for a while, and were already there during the time of Power Virtual Agents. What is new and is gaining popularity is the set of Generative AI features now available in the platform.

  • Generative Answers: the creator of the bot can configure some resources (public sites, SharePoint sites, or documents). The bot will then be able to query them and respond based on the related content. Makers can combine this with the Fallback Topic: if the bot cannot answer a specific question, it will browse through the configured resources to respond.
  • AI-Based Copilot Authoring: this is the same principle as with the M365 product Copilots or other Power Platform Maker Copilots. Makers can design the chatbot by conversing with the platform using Generative AI.
  • Generative Actions and Plugins for Copilot Studio: once turned on, Makers can create plugin actions with descriptions about their intended use. The chatbot will automatically identify when it should use a plugin action in the context of the conversation. Makers can build such plugin actions with Power Automate, connectors, or custom code with Bot Framework skills.
  • M365 Copilot Conversational Plugins: this feature allows Makers to customize Microsoft Copilot. Such plugins let users integrate with third-party tools, for example to ask about their IT support tickets from within the Microsoft Copilot chat.

Licensing

There are two main types of licenses relevant to leverage Copilot Studio Standalone Copilots:

  • A user license (“Microsoft Copilot Studio User License”): this license is free and is required to author Copilot Studio bots. Without this license assigned, users will only be able to create Trial bots, which will eventually expire.
  • A tenant/capacity license: this license applies at tenant level to provide some capacity. Every time a user chats with a bot, it consumes some of this capacity. It is similar to the SharePoint or Dataverse capacity models. There are two main types of capacity licenses for Copilot Studio:
    • Chat Sessions: this is the old model, deprecated at the end of 2023. This license provides 2K chat sessions per month, across all bots. A chat session is a conversation comprising multiple messages with a single user.
    • Messages: this is the new model, providing 25K messages per month. This model counts each message sent by the bot as 1 unit of the 25K. If the bot leverages Gen AI to respond with a message, it consumes 2 units.

Admins can now assign these capacity units (chat sessions or messages) to specific environments (same principle as AI Builder).
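As a rough illustration of the messages model, a small script can estimate monthly consumption. This is only a sketch based on the 1-unit/2-unit rates described above, not an official calculator; the sample numbers are made up.

```python
def monthly_message_units(plain_messages: int, gen_ai_messages: int) -> int:
    """Estimate capacity units consumed under the 'messages' model:
    each bot message costs 1 unit, each Gen-AI-generated message costs 2."""
    return plain_messages * 1 + gen_ai_messages * 2

ALLOWANCE = 25_000  # units included per capacity license, per month

used = monthly_message_units(plain_messages=8_000, gen_ai_messages=5_000)
print(used, used <= ALLOWANCE)  # → 18000 True
```

Running a projection like this against the CoE Kit's aggregated counts can help anticipate whether additional capacity licenses are needed.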

Customizing Microsoft Copilot with plugins only requires Copilot licenses for the users consuming the plugins, so there does not seem to be any specific license for that.

Scope

For this post I will exclude the Copilot Studio capabilities for customizing M365 Copilot, and will focus on the Copilot Studio Governance of standalone copilots. I do not present the wider platform governance capabilities here either, but here is another article where I share some general advice to govern the platform, and here is one where I share some key tenant restrictions to know about.

DLP Policies for Copilot Studio

Admins can restrict Copilot Studio with Power Platform DLP Policies, one of the key components for Copilot Studio Governance.

Configuration

Six configuration options specific to Copilot Studio are available as part of the Power Platform DLP Policies:

  • Chat without Microsoft Entra ID authentication: when blocked on an environment, users cannot interact with a chatbot unless authentication has been configured for it. It can be any type of authentication.
  • Direct Line, Facebook, Microsoft Teams and Omnichannel: if these channels are blocked on an environment, Makers will not be able to configure their bots on them, and interacting with an existing bot from that environment will fail on these channels.
  • Skills with Copilot Studio: if this is blocked, Makers will not be able to leverage Microsoft Bot Framework skills for the bots on that environment.

If the HTTP connector is blocked on an environment, the related node will not be available from Copilot Studio either. All the other connectors can also be blocked, which would prevent a chatbot from triggering Cloud Flows that use these connectors.

Enforcement

Something very important: blocking these Copilot-Studio-specific options in a DLP Policy does nothing by default! In my opinion, this would deserve some warnings in the DLP Policy interface and clearer communication in the documentation. Essentially, a tenant admin has to run the below PowerShell command in order to enforce the configuration of Copilot Studio options in the DLP Policies.

Set-PowerVirtualAgentsDlpEnforcement -TenantId <tenant ID> -Mode <Mode>

<Mode> can take three values:

  • Disabled (default value): blocking the Copilot Studio options in the DLP Policy will have virtually no impact on Makers or Users.
  • SoftEnabled: blocking the Copilot Studio options in the DLP Policy will have no impact on existing bots and their users. Any bots already published with the softly blocked configuration will continue to work. However, for Makers:
    • Makers will see the below error messages when they access their bots in the Copilot Studio portal. The error message is the same as under strict enforcement, which is a little confusing: Makers have no way to know from the message alone whether their users are impacted, unless they try it themselves. These error messages are not always specific either, which adds to the confusion…
    • Makers will not be able to publish updates to their bots or to create new bots if they do not comply with the applied DLP Policy. This forces them to make their bots compliant while avoiding impact on the users of already published bots.
    • Makers will not be able to select the blocked options when configuring their bots; they will be greyed out.
      Screenshot of the greyed out options in Copilot Studio Portal once the enforcement is softEnabled
  • Enabled: the only difference from the SoftEnabled mode is that any non-compliant bot will start failing when users interact with it. Users will then see errors such as:
    Screenshot of an DLP error showing in a Copilot Studio chat

Additional Comments about DLP Enforcement for Copilot Studio

  • If admins block the “Chat without Microsoft Entra ID authentication” option, the Maker will not even be able to test the bot in the Copilot Studio portal, unless an authentication method is configured. However, just having the “Only for Teams and Power Apps” authentication selected is enough for the Maker to be able to test the bot from the Maker portal.
  • Blocking the different channel options in the DLP Policy does not prevent the Maker from testing the bot in the Maker portal.
  • There is the option of adding the “-OnlyForBotsCreatedAfter” parameter to the PowerShell command to enable the enforcement only for bots created after a certain date. This can be helpful for a transition towards a strict enforcement.
  • I had issues during my tests when blocking the Direct Line channel in a DLP Policy. Since the Demo Site channel is on by default, it seems to throw a conflict and prevented me from publishing the bot, with no way to disable the demo channel. I have an open ticket with Microsoft to follow up.
  • There is also the possibility of exempting a bot from the applied DLP Policies. I would suggest being careful with DLP Policy exemptions since they can easily make things messy and confusing for admins.

Tenant Setting to Prevent Bot publication with Generative AI

Configuration

There is a tenant setting available to block the publication of bots using Generative AI. This setting is available from the Power Platform Admin Center for Tenant Admins who can also change it with PowerShell.

Screenshot of the Power Platform Admin Center and Copilot Studio setting to block the publication of bots using Gen AI

The PowerShell script to turn off this setting:

$set = Get-TenantSettings
$set.powerPlatform.intelligence.enableOpenAiBotPublishing = $false
Set-TenantSettings -RequestBody $set

Once this setting is off, Makers will no longer be able to publish updates or new bots if the Generative AI features are in use:

Screenshot of the error message when trying to publish a bot with Gen AI if it is turned off at tenant level

However, Makers will still be able to configure and test the Gen AI features from the Copilot Studio portal with the embedded chat.

Scope

What exactly does Microsoft consider a Gen AI feature for this setting? There does not seem to be a clear answer in the documentation, but from my testing it seems to include all the features below:

  • The “Boost conversational coverage with Generative answers” setting in the “Settings”>”Generative AI” tab
  • The “Dynamic chaining with generative actions” setting in the “Settings”>”Generative AI” tab
  • The use of the “Advanced” > “Generative answers” node when configuring a topic
  • The use of Connector or Plugin Actions with Dynamic Chaining (triggered from the context and not from a Topic). However, the use of Connector Actions from within a Topic is not impacted.

It means that if the publication of bots with AI is turned off with this setting, Makers will not be able to publish bots using any of the above features.

⚠️Warning about the Enforcement

As explained above, once the setting is turned off, Makers will not be able to publish updates or new bots leveraging these functionalities. What about the bots already published? Well… it seems to not impact them. In other words, if a bot using Generative Answers is already published, users will still be able to use it indefinitely, even if the tenant setting to publish bots with AI is turned off. Not ideal for Copilot Studio Governance… This setting only prevents additional bot publications leveraging AI, whether they are updates or new bots.

Also, unfortunately this setting is currently only available at tenant level, which does not allow any more granular control at environment level.

My wish list for Microsoft 🎁 would be to make this setting part of the DLP Policy options. This would allow managing the enforcement for existing bots and a more granular control per environment, while preserving the control of Tenant Admins (as opposed to the traditional environment settings, which can be changed by any environment System Admin…).

Visibility and Monitoring for Admins

A key aspect of Power Platform Governance, and therefore of Copilot Studio Governance, is being able to monitor platform usage.

What is not there (yet?)

Let’s start with what there isn’t: for now, there is nothing in the Power Platform Admin Center that allows admins to monitor Copilot Studio usage:

  • The Tenant Analytics only provide information about Apps, Flows and Dataverse, but nothing about chatbots.
  • When opening a specific environment in the Admin Center, the Resources can only list Power Apps, Flows, Power Pages and D365 Apps.
  • There are no consumption reports available for download at tenant level to see which environments are consuming capacity.

Now that this is out of the way, let’s look at what we can do!

The data structure

Let’s first take a look at the data structure. Every chatbot created in Copilot Studio is stored in the hosting Environment. The information is mostly available in the bot and botcomponent tables (display names respectively Chatbots and Chatbot subcomponents).

  • The bot table contains the core information about each chatbot. For example, it contains the display name, whether the bot is published, and the configured authentication method.
  • The botcomponent table stores all the components used by the bot such as the associated Topics and Variables.
Screenshot of the tables storing the chatbot information for copilot studio

Whenever a user has a conversation with a bot, the conversation is stored in the conversationtranscript table (Conversation Transcript). By default, a process runs every day to delete any transcripts older than 30 days, but admins can customize this (more information here).
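Since these are regular Dataverse tables, the bot inventory can also be explored through the standard Dataverse Web API. The sketch below only builds the query URL; the environment URL is hypothetical, the `$select` column names are illustrative (verify them against your environment's metadata), and the actual GET request would need a bearer token for that environment.

```python
def build_bot_inventory_url(org_api_url: str) -> str:
    """Build a Dataverse Web API query listing rows of the 'bot' table
    (display name: Chatbots) in one environment. The $select columns
    are assumptions -- check them against your environment."""
    select = "name,createdon,publishedon"
    return f"{org_api_url}/api/data/v9.2/bots?$select={select}"

# Hypothetical org URL; send the GET with an admin bearer token.
print(build_bot_inventory_url("https://myorg.crm.dynamics.com"))
# → https://myorg.crm.dynamics.com/api/data/v9.2/bots?$select=name,createdon,publishedon
```

The same pattern applies to the botcomponent and conversationtranscript tables, which is what makes the custom monitoring approach described below possible.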

The CoE Kit as your best Copilot Studio Governance buddy

As usual, the CoE Starter Kit is way ahead of what is available in the Power Platform Admin Center. At present, the sync flows of the kit browse through each environment to inventory all the bots created. The data collected includes:

  • The bots’ general information (name, creation date, owner, environment, etc.),
  • Aggregated counts of conversations, which can be helpful to track the capacity usage (for the chat sessions model).
Screenshot of the CoE Kit dashboard for Copilot Studio

There are a few limitations:

  • The kit sync flows do not track the configuration of the bots (authentication, etc.) either. I have raised this GitHub idea to ask if they could include it, and it seems like this will be the case for the May release, which will be a big help for admins.
  • The bot usage is limited to the count of chat sessions and does not include the count of messages, which means it is not easy to measure the capacity consumption for the messages-based model.

Custom approach

Since the data is stored in each environment hosting a Copilot Studio bot, nothing stops us from creating custom flows, scripts, or Power BI dashboards to connect to the Dataverse tables of all environments and report on the related data. For companies refusing to use the CoE Starter Kit, it might be a viable option at the moment. That said, I tend to suggest leveraging the CoE Starter Kit and raising Ideas on GitHub to ask them to cover what admins need. The CoE Starter Kit team does an outstanding job and works 100% on the topic. It would be sad not to leverage all this work, especially when we consider that most Power Platform CoEs I have met are understaffed…

Prevent Bot Creation

We have seen how to block the publication of new bots with Gen AI features, how to block some specific configurations in published bots with DLP Policies, and how to monitor the usage of Copilot Studio with the CoE Kit. All these things are essential for a strong Copilot Studio Governance. Now… How can we prevent a user from being able to create a chatbot in the first place? You might want to prevent users from using this product at all, or to enforce the use of specific environments for Copilot Studio.

At tenant level if Self-Service sign up is disabled

If the possibility for users to sign up for Trials is disabled, users should only be able to create chatbots if they have been assigned the Copilot Studio User license. Controlling who has this license can help to control who can create chatbots.

If users can sign up for Trials themselves, then this method will not be sufficient as they will be able to sign up for a Trial and create bots anyway. In this great article, Thibault Joubert explains how admins can disable this option.

At environment level with Security Roles

Security Roles can grant permissions over the Chatbot table. If these permissions are removed from Makers’ roles, they will not be able to create bots in that environment. Since the Environment Maker role is commonly used to grant Makers access to environments, customizing it to remove the privileges on the Chatbot table could be an option.

Instead of manually customizing a role on each environment, it is possible to:

  • Create an unmanaged solution with the customized role in a Source environment,
  • Deploy it to multiple target environments with the Solution Deployer which I shared in a former post.

Delete Bots as Admins

Another critical aspect of Copilot Studio Governance is knowing how we can delete bots. There are multiple situations in which an admin might want to delete bots, and unfortunately, as of today, there is no suitable way for large organizations. I raised an idea for it here about 2 years ago but sadly never heard back. There are two things we want to delete:

  • The chatbot information in Dataverse,
  • The Azure Application which is automatically provisioned when a bot is created.

I found two ways to achieve this; however, neither is ideal…

With the Dataverse Bound Action PvaDeleteBot

With the Dataverse connector, there is a Bound Action for the Chatbot table: “PvaDeleteBot”. Since the modern Dataverse connector allows connecting to other environments, I thought this was the solution I was looking for!
Screenshot of the Dataverse Bound action PvaDeleteBot
It does indeed remove the bot from Dataverse and the Copilot Studio portal. Unfortunately it does not remove the App in Azure. From a governance and keeping-things-tidy perspective, this is not ideal.

With the Dataverse API

Looking at the network trace from the Copilot Studio portal, I found that the POST request below is responsible for the bot deletion:

https://{OrgApiUrl}/api/data/v9.2/bots({BotId})/Microsoft.Dynamics.CRM.PvaDeleteBot?tag=deprovisionbotondelete

It seems like the “tag=deprovisionbotondelete” parameter is what we are after, since it actually also deletes the associated App in Azure. Unfortunately, I couldn’t find a way to use this tag with the Dataverse Bound Action. There are two main ways in which admins can make this HTTP request automatically from Power Automate:

  • By using the HTTP connector: with this approach, admins will need to provision an Application User for each target environment so that they can get the appropriate auth token. It can be manageable if there are only a few environments, but it becomes a lot more complex with hundreds or thousands of environments and might require an additional automated Application User creation system… EDIT: I have now shared a solution which allows admins to delete bots and manage the App Users programmatically, you can find it here.
  • By using the HTTP with Entra ID connector: with this approach, admins can create a connection with a tenant admin account on the target environments and make the request without an Application User. However, here again the admins will need to set up the connections for each environment…
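If you script this outside Power Automate, building the bound-action URL from the pattern above is straightforward. This is only a sketch: the org URL and bot ID are hypothetical placeholders, and sending the POST would require a bearer token for the target environment, which is omitted here.

```python
def build_delete_bot_url(org_api_url: str, bot_id: str) -> str:
    """Build the PvaDeleteBot bound-action URL with the
    'deprovisionbotondelete' tag observed in the portal's network trace."""
    return (f"{org_api_url}/api/data/v9.2/bots({bot_id})"
            f"/Microsoft.Dynamics.CRM.PvaDeleteBot?tag=deprovisionbotondelete")

# Hypothetical values; the POST would be sent with an admin bearer token.
print(build_delete_bot_url("https://myorg.crm.dynamics.com",
                           "00000000-0000-0000-0000-000000000000"))
```

The warning below applies equally to this sketch, since it targets the same undocumented endpoint.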

⚠️ Please note that although this API call works, I did not find any documentation about it, so it is very likely not supported by Microsoft. Use at your own risk…

Recommendations

The perfect Copilot Studio Governance does not exist; it will always depend on the context of the organization. Still, here are a few recommendations if you would like to leverage Copilot Studio:

  • If not done already, do consider the overall platform governance first and set up some key tenant restrictions. Capabilities like tenant-isolation can go a long way.
  • Consider the overall Operational Model to govern the platform. Having clear roles and responsibilities within the organization will be instrumental to govern a product like Copilot Studio, with all the popularity it is gaining, especially for large organizations.
  • Involve the AI Governance team, or the team closest to being responsible for it. They might have some input with regards to Generative AI usage in the context of a chatbot.
  • Be clear on retention requirements for the conversation transcripts and ensure that least privilege is in place on any environment where Copilot Studio is available. System Administrators will be able to see all the transcripts of all bots on their environments, which could lead to privacy concerns.
  • Do leverage the CoE Kit for visibility, or at least implement a custom solution to monitor the inventory and usage. Having a tool available to users with blind admins is a recipe for disaster…
  • Block unauthenticated bots with DLP Policies on all environments by default. Exposing internal data anonymously and publicly is the number one risk with Copilot Studio. You can always define an exception process to allow specific use cases on dedicated environments if needed.
  • Consider limiting the use of Copilot Studio in the default environment. By blocking all the Copilot Studio options in the DLP Policies, for example, you ensure that users will not be able to publish bots in that environment, while they will still be able to use Copilot Studio from the portal for personal productivity and exploration.

That’s it! I hope this post will be helpful to establish a Copilot Studio Governance strategy, let me know if you think of anything else!


Categories: Governance

4 Comments

Chris Wanja · May 7, 2024 at 9:01 pm

Hey Valentin! Great article. An update on deleting bots – both the Azure App Registration and related row in Dataverse is removed now 😊

    Valentin Mazhar · May 9, 2024 at 8:49 am

    Hi Chris! Thanks for your comment – do you mean that the Dataverse Bound Action “PvaDeleteBot” does now also delete the App Registration?
    Just gave it a go with the Dataverse connector from Power Automate and the App registration was not removed…

      Chris Wanja · May 15, 2024 at 5:15 pm

      Sorry, I should have been clearer in that if the user removes their bot those resources are removed. Not if an admin performs the action. I would be curious if you could extend your flow and an Azure API call to remove the application registration that matches the bot name? Have not looked into it that closely.

        Valentin Mazhar · May 15, 2024 at 5:19 pm

        I see! Yes the flow could probably delete the Azure App via API, but I suppose it would then require permissions over Azure and make it less accessible for Power Platform Admins…
        I just published an alternative here: https://powertricks.io/delete-copilot-studio-bots-as-admin/
        let me know what you think!
