Part 2 – Build & Ship a “Docs Agent” to Microsoft Teams

(Companion guide to “Spin-Up the Microsoft Learn MCP Server”)

Make sure you have read Part 1 and set up the Docs MCP custom connector before continuing.

  1. What you’ll build
  2. Prerequisites
    1. Icons to Download (optional)
  3. 1 – Create the Agent in Copilot Studio
    1. Add Suggested Prompts
    2. Agent Settings
    3. Turn Off Pointless Topics
  4. Publish & Package for Teams
    1. Submit Agent for Approval
  5. Approve Agent App (As a Teams Admin)
  6. How to Use the Agent
    1. Adding Agent to a Meeting or Chat
    2. Troubleshooting

What you’ll build

A Copilot Studio agent that queries the Microsoft Learn MCP server for live docs, then answers teammates inside a Teams chat or Meeting.

Prerequisites

Need | Notes
Docs MCP custom connector from Part 1 | Already in your environment. (https://flowaltdelete.ca/2025/06/26/how-to-spin-up-the-microsoft-learn-mcp-server-in-copilot-studio/)
Copilot Studio (preview) tenant | Generative orchestration enabled. (Early Features)
Teams admin rights (or approval from your Teams Admin) | To upload a custom app or publish to the org.
Copilot Studio license | Message packs or sessions.

Icons to Download (optional)

Below are icons you can use for the Agent and the MCP custom connector.

1 – Create the Agent in Copilot Studio

In this example I am going to use the existing agent I created from Part 1.

  1. Modify or create the agent with a meaningful name, description, and icon.
    (You can use the one I provided from above or use your own)
  2. Name: MS Docs Agent
  3. Description: MS Docs Agent is your on-demand mentor for Microsoft technologies—built with Copilot Studio and powered by the Microsoft Learn MCP server. Every answer comes from the live, authoritative docs that Microsoft publishes each day, so you never rely on stale model memories or web-scraped content.
  4. Orchestration = Enabled

  5. For the agent's Instructions, we don't want to add too much. After a lot of testing I found that, in its current state, the Docs MCP server handles instructions well on its own, and adding too many instructions causes the response to fail. So it's better to leave the instructions blank for now.
  6. Web Search – this should be Disabled. We only want the agent to query the docs, which it does through the MCP server.
  7. Knowledge should be empty. The only thing we want this agent to do is query the Docs MCP server, so that should be the only Tool the agent has access to.
  8. To recap, the only Tool this agent should have is the MCP server (custom connector) we created in the first blog post, and Knowledge should stay empty. If you need help setting this up, refer to Part 1.

Add Suggested Prompts

When users interact with the agent in M365 chat (Copilot) we can show suggested prompts to help guide the user in what is possible with this agent. Here are a bunch of samples you can give your agent:

Title | Prompt
Dev Env for Power Apps | Set up a developer environment for Power Apps—step-by-step.
Rollup vs Formula | Rollup fields vs Formula columns in Dataverse—when to use each?
Flow 502 Fix | Power Automate flow fails with 502 Bad Gateway—how do I resolve it?
Cert Path Finder | Fastest certification path for a Dynamics 365 functional consultant.
PL-200 Module List | List every Microsoft Learn module covered by the PL-200 exam.
Managed Env Enable | Turn on managed environments and approval gates in Power Platform.
Finance DLP Policy | Best-practice DLP setup for finance data in Power Platform.
Power Fx Date Filter | Sample Power Fx to filter a gallery to today’s records.
OpenAI Flow Sample | Minimal example: call Azure OpenAI from Power Automate.
Secure Env Vars | Secure environment variables with Azure Key Vault in flows.
Pipeline Checklist | Checklist to deploy a solution through Power Platform pipelines.
PCF Chart Control | Build a PCF control that renders a chart on a model-driven form.
New PA Features | Summarize new Power Apps features announced this month.
Preview Connectors | List preview connectors added to Power Automate in the last 30 days.
Explain to a Child | Explain Dataverse to a five-year-old.

You can only add six suggested prompts, so choose carefully.

Agent Settings

Next we want to configure some settings on the agent.

  1. Click the Settings button on the top right.

  2. (Optional) If you want the agent to have reasoning capabilities > Under Generative AI turn on: Deep reasoning
    **Note that this is a premium feature**

  3. Scroll down to Knowledge, make sure Use general knowledge and Use information from the Web are both OFF

  4. Make sure to click Save once done.

Turn Off Pointless Topics

Next we will turn off the topics we don’t want the agent to use.

  1. Click on Topics tab > Under Custom > Only leave Start Over topic On.

  2. Under System > Turn Off:
    – End of Conversation
    – Escalate
    – Fallback
    – Multiple Topics Matched

  3. Next, let's modify the Conversation Start topic to make it sound better.
    Click the Conversation Start topic > Modify the Message node:

  4. Click Save.

Now we are ready to Publish and Package for Teams!

Publish & Package for Teams

Next we need to Publish our agent.

  1. Click on the Channels tab > Click Publish

  2. Once your agent is published > Click on the Teams and Microsoft 365 Copilot channel.

  3. A sidebar opens > Check the Make agent available in Microsoft 365 Copilot > Click Add channel.

  4. After the channel has been added > Click Edit details.

  5. This is where we configure the agent in Teams. We will modify the icon, give it a short description and a long description, and allow the agent to be added to a team and to meeting chats.
    Under Teams settings > Check both:
    Users can add this agent to a team
    Use this agent for group and meeting chats

  6. Click Save

Submit Agent for Approval

Because we want our organization to easily find and use this agent, we will submit it to the Agent Store. To do this, follow these steps:

  1. First, publish your agent so that the newest version is what you push to your Teams admin for approval.
  2. Next click on the Channels tab > Select the Teams and Microsoft 365 Copilot channel.
  3. Now click Availability options.

  4. Now we will configure the Show to everyone in my org option.

  5. Then click Submit for admin approval.

    Now we will look at what a Teams Admin has to do.

Approve Agent App (As a Teams Admin)

A Microsoft Teams Admin will have to approve the Agent app before your org can use it. As a Teams Admin follow these steps:

  1. Navigate to https://admin.teams.microsoft.com/policies/manage-apps
    (Click on Manage apps under Teams apps)
  2. Search for your agent name in the search bar

  3. Click the agent > Publish.

  4. Note: You will need admin approval each time you want to publish an update to the agent.

How to Use the Agent

Once your agent is approved by an admin, you can easily find it in the Agent Store. Another easy way to get to your agent is to open it from Copilot Studio:

  1. Click Channels tab > Select Teams and Microsoft 365 Copilot channel > Click See agent in Teams.

You will be brought to Teams with the agent open. You can now add it:

Adding Agent to a Meeting or Chat

There are a few ways to add the agent to a meeting. One easy way is to @mention the agent in the chat.

**Note: start typing the name of the agent and it should show up**

Troubleshooting

There are a few things I ran into:
1) If you're getting an error from the MCP server, remove all custom instructions from the agent.

2) Sometimes your agent's details are cached and show old metadata. In that case, resubmit the app for approval.

3) Always test the agent inside the Copilot Studio test pane with topic tracking and the Activity map turned on.

Add the Microsoft Learn Docs MCP Server in Copilot Studio

Add Microsoft’s Learn Docs MCP server in Copilot Studio, verify the tool, and query official docs—fast, first-party, step-by-step.

UPDATE—August 8, 2025: You no longer need to create a custom connector for the Microsoft Learn Docs MCP server. Copilot Studio now includes a native Microsoft Learn Docs MCP Server under Add tool → Model Context Protocol.
This guide has been updated to show the first-party path. If your tenant doesn’t yet show the native tile, use the Legacy approach at the bottom.

What changed

  • No YAML or custom connector required
  • Fewer steps, faster setup

Model Context Protocol (MCP) is the universal “USB-C” port for AI agents. It standardizes how a model discovers tools, streams data, and fires off actions—no bespoke SDKs, no brittle scraping. Add an MCP server and your agent instantly inherits whatever resources, tools, and prompts that server exposes, auto-updating as the backend evolves.

  1. Why you should care
  2. What the Microsoft Learn Docs MCP Server delivers
  3. Prerequisites
  4. Step 1 – Add the native Microsoft Learn Docs MCP Server
  5. Step 2 – Validate
  6. Legacy approach (if the native tile isn’t available)

Why you should care

  • Zero-integration overhead – connect in a click inside Copilot Studio or VS Code; the protocol handles tool discovery and auth.
  • Future-proof – the spec just hit GA and already ships in Microsoft, GitHub, and open-source stacks.
  • Hallucination killer – answers are grounded in authoritative servers rather than fuzzy internet guesses.

What the Microsoft Learn Docs MCP Server delivers

  • Tools: microsoft_docs_search – fire a plain-English query and stream back markdown-ready excerpts, links, and code snippets from official docs.
  • Always current – pulls live content from Learn, so your agent cites the newest releases and preview APIs automatically.
  • First-party & fast — add it in seconds from the Model Context Protocol gallery; no OpenAPI import needed.

Bottom line: MCP turns documentation (or any backend) into a first-class superpower for your agents—and the Learn Docs server is the showcase. Connect once, answer everything.

Prerequisites

  • Copilot Studio environment with Generative Orchestration (might need early features on)
  • Environment-maker rights
  • Outbound HTTPS to learn.microsoft.com/api/mcp

Step 1 – Add the native Microsoft Learn Docs MCP Server

  1. Go to Copilot Studio: https://copilotstudio.microsoft.com/
  2. Go to Tools → Add tool.
  3. Select the Model Context Protocol pill.
  4. Click Microsoft Learn Docs MCP Server.
  5. Choose the connection (usually automatic) and click Add to agent.
  6. Confirm the connection status is Connected.
Copilot Studio Add tool panel showing Model Context Protocol category and Microsoft Learn Docs MCP Server tile highlighted.
  7. The MCP server should now show up in Tools.
  8. Click the server to verify the tool(s) and to make sure:
    – ✅ Allow agent to decide dynamically when to use this tool
    – Ask the end user before running = No
    – Credentials to use = End user credentials

Step 2 – Validate

  1. In the Test your agent pane, turn on the Activity map by clicking the wavy map icon:

  2. Now try a prompt like:
    What MS certs should I look at for Power Platform?
    How can I extend the Power Platform CoE Starter Kit?
    What modern controls in Power Apps are GA and which are still in preview? Format as a table

Use-Case Ideas

  • Internal help-desk bot that cites docs.
  • Learning-path recommender (your pipeline example).
  • Governance bot that checks best-practice links.

Troubleshooting Cheat-Sheet

  • Note that the Learn Docs MCP server currently does NOT require authentication. This will most likely change in the future.
  • If Model Context Protocol is not shown under Tools in Copilot Studio, you may need to create an environment with Early Features turned on.
  • Do NOT reference the MCP server in the agent's instructions, or you will get a tool error.
  • Check the Activity tab for monitoring.

Legacy approach (if the native tile isn’t available)

Grab the Minimal YAML

  1. Open your favorite code editor or Notepad, and copy and paste this YAML into a new file.
swagger: '2.0'
info:
  title: Microsoft Docs MCP
  description: Streams Microsoft official documentation to AI agents via Model Context Protocol
  version: 1.0.0
host: learn.microsoft.com
basePath: /api
schemes:
  - https
paths:
  /mcp:
    post:
      summary: Invoke Microsoft Docs MCP server
      x-ms-agentic-protocol: mcp-streamable-1.0
      operationId: InvokeDocsMcp
      consumes:
        - application/json
      produces:
        - application/json
      responses:
        '200':
          description: Success
  2. Save the file with a .yaml extension.

Import a Custom Connector

Next we need to create a custom connector for the MCP server to connect to. We will do this by importing the YAML file we created in the previous step.

  1. Go to make.powerapps.com > Custom connectors > + New custom connector > Import OpenAPI.

  2. Upload your YAML file, e.g. ms-docs-mcp.yaml, using the Import an OpenAPI file option.

  3. General tab: Confirm Host and Base URL.
    Host: learn.microsoft.com
    Base URL: /api
  4. Security tab > No authentication (the Docs MCP server is anonymously readable today).
  5. Definition tab > verify one action named InvokeDocsMcp is present.
    Also add a description.

  6. Click Create connector. Once the connector is created, click the Test tab, and click +New Connection.

    (Note, you may see more than 1 Operation after creating the connector. Don’t worry and continue on)
  7. When you create a connection, you will be navigated away from your custom connector. Verify your Connection is in Connected Status.

    Next we will wire this up to our Agent in Copilot Studio.

Get the difference between two dates (Updated 2025)

Many Power Automate users encounter issues with the dateDifference() function when calculating the difference between two dates. The problem arises when the output format varies depending on the duration, causing errors in extracting Days, Hours, Minutes, and Seconds.

This blog provides a robust and easy-to-implement solution that works seamlessly in all scenarios, including durations less than a day. Learn how to use a single expression with conditional logic to avoid these common pitfalls and ensure your date calculations are accurate every time. This is your ultimate fix for handling dateDifference() errors!

  1. The Flow
    1. dateDifference expression
      1. How it works
    2. Steps to Access Each Value
  2. Download my Flow
    1. Classic designer
    2. New designer
  3. Conclusion

The Flow

  1. Compose action: named StartDate = 2024-12-10T15:58:28
  2. Compose action: named EndDate = 2024-12-10T19:22:20
  3. Compose action: uses the dateDifference() expression (see below)

Below is the expression used in the ‘Date Difference’ compose action. It dynamically handles all scenarios—when days are included and when they are not (same with hours and minutes).

dateDifference expression

Create Compose actions for StartDate and EndDate, then add the expression below to a third Compose action:

if(
   contains(
     dateDifference(outputs('StartDate'), outputs('EndDate')), 
     '.'
   ),
   json(
     concat(
       '{"Days":', string(int(split(dateDifference(outputs('StartDate'), outputs('EndDate')), '.')[0])),
       ',"Hours":', string(int(split(split(dateDifference(outputs('StartDate'), outputs('EndDate')), '.')[1], ':')[0])),
       ',"Minutes":', string(int(split(split(dateDifference(outputs('StartDate'), outputs('EndDate')), '.')[1], ':')[1])),
       ',"Seconds":', string(int(split(split(dateDifference(outputs('StartDate'), outputs('EndDate')), '.')[1], ':')[2])),
       '}'
     )
   ),
   json(
     concat(
       '{"Days":0',
       ',"Hours":', string(int(split(dateDifference(outputs('StartDate'), outputs('EndDate')), ':')[0])),
       ',"Minutes":', string(int(split(dateDifference(outputs('StartDate'), outputs('EndDate')), ':')[1])),
       ',"Seconds":', string(int(split(dateDifference(outputs('StartDate'), outputs('EndDate')), ':')[2])),
       '}'
     )
   )
)

How it works

  • The if() function checks if the dateDifference() result contains a . (dot).
  • If it does, it means the result has a days component (e.g., 1268.04:15:30), so we parse out Days, Hours, Minutes, and Seconds accordingly.
  • If it does not, it means the result is less than a day (e.g., 12:57:47.2544602), so we treat Days as 0 and parse Hours, Minutes, and Seconds directly from the string.

Result:

This will produce a JSON object like:
{
"Days": 1268,
"Hours": 4,
"Minutes": 15,
"Seconds": 30
}

Or
{
"Days": 0,
"Hours": 12,
"Minutes": 57,
"Seconds": 47
}

Steps to Access Each Value

If you use the fixed expression directly in a Compose action (e.g., named Date_Difference), you can reference the fields like this:

  • Days: outputs('Date_Difference')?['Days']
  • Hours: outputs('Date_Difference')?['Hours']
  • Minutes: outputs('Date_Difference')?['Minutes']
  • Seconds: outputs('Date_Difference')?['Seconds']

Use these expressions in subsequent actions (like another Compose, a Condition, or Apply to Each) to reference the specific values.
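For example, a follow-up Compose action could turn the object into a readable duration string. This is just a minimal sketch, assuming the Compose above is named Date_Difference as in the example:

concat(
   string(outputs('Date_Difference')?['Days']), ' days, ',
   string(outputs('Date_Difference')?['Hours']), ' hours, ',
   string(outputs('Date_Difference')?['Minutes']), ' minutes, ',
   string(outputs('Date_Difference')?['Seconds']), ' seconds'
)

With the values above, this would output something like "1268 days, 4 hours, 15 minutes, 30 seconds".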

Download my Flow

You can easily copy and paste actions in Power Automate, which lets you paste my example directly into your own flow.

  1. Classic designer
  2. New designer

Classic designer

Step 1: Copy the code snippet

{"id":"b6b531e2-b7b5-4a9e-86bd-7e2a069529a0","brandColor":"#8C3900","connectionReferences":{},"connectorDisplayName":"Control","icon":"data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMzIiIGhlaWdodD0iMzIiIHZlcnNpb249IjEuMSIgdmlld0JveD0iMCAwIDMyIDMyIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPg0KIDxwYXRoIGQ9Im0wIDBoMzJ2MzJoLTMyeiIgZmlsbD0iIzhDMzkwMCIvPg0KIDxwYXRoIGQ9Im04IDEwaDE2djEyaC0xNnptMTUgMTF2LTEwaC0xNHYxMHptLTItOHY2aC0xMHYtNnptLTEgNXYtNGgtOHY0eiIgZmlsbD0iI2ZmZiIvPg0KPC9zdmc+DQo=","isTrigger":false,"operationName":"Get_date_difference_object","operationDefinition":{"type":"Scope","actions":{"StartDate":{"type":"Compose","inputs":"2024-12-10T15:58:28","runAfter":{}},"EndDate":{"type":"Compose","inputs":"2024-12-10T19:22:20","runAfter":{"StartDate":["Succeeded"]}},"Date_Difference":{"type":"Compose","inputs":"@if(\r\n   contains(\r\n     dateDifference(outputs('StartDate'), outputs('EndDate')), \r\n     '.'\r\n   ),\r\n   json(\r\n     concat(\r\n       '{\"Days\":', string(int(split(dateDifference(outputs('StartDate'), outputs('EndDate')), '.')[0])),\r\n       ',\"Hours\":', string(int(split(split(dateDifference(outputs('StartDate'), outputs('EndDate')), '.')[1], ':')[0])),\r\n       ',\"Minutes\":', string(int(split(split(dateDifference(outputs('StartDate'), outputs('EndDate')), '.')[1], ':')[1])),\r\n       ',\"Seconds\":', string(int(split(split(dateDifference(outputs('StartDate'), outputs('EndDate')), '.')[1], ':')[2])),\r\n       '}'\r\n     )\r\n   ),\r\n   json(\r\n     concat(\r\n       '{\"Days\":0',\r\n       ',\"Hours\":', string(int(split(dateDifference(outputs('StartDate'), outputs('EndDate')), ':')[0])),\r\n       ',\"Minutes\":', string(int(split(dateDifference(outputs('StartDate'), outputs('EndDate')), ':')[1])),\r\n       ',\"Seconds\":', string(int(split(dateDifference(outputs('StartDate'), outputs('EndDate')), ':')[2])),\r\n       '}'\r\n     )\r\n   )\r\n)","runAfter":{"EndDate":["Succeeded"]},"metadata":{"operationMetadataId":"03c8d578-576a-41a3-8d63-609a15ce594b"}}},"runAfter":{"Add_to_time":["Succeeded"]}}}

Step 2: In Power Automate, when adding a new action, click My clipboard.

Step 3: Ctrl + V


New designer

Step 1: Copy the code snippet

{"nodeId":"Get_date_difference_object-copy","serializedOperation":{"type":"Scope","actions":{"StartDate":{"type":"Compose","inputs":"2024-12-10T15:58:28"},"EndDate":{"type":"Compose","inputs":"2024-12-10T19:22:20","runAfter":{"StartDate":["Succeeded"]}},"Date_Difference":{"type":"Compose","inputs":"@if(\r\n   contains(\r\n     dateDifference(outputs('StartDate'), outputs('EndDate')), \r\n     '.'\r\n   ),\r\n   json(\r\n     concat(\r\n       '{\"Days\":', string(int(split(dateDifference(outputs('StartDate'), outputs('EndDate')), '.')[0])),\r\n       ',\"Hours\":', string(int(split(split(dateDifference(outputs('StartDate'), outputs('EndDate')), '.')[1], ':')[0])),\r\n       ',\"Minutes\":', string(int(split(split(dateDifference(outputs('StartDate'), outputs('EndDate')), '.')[1], ':')[1])),\r\n       ',\"Seconds\":', string(int(split(split(dateDifference(outputs('StartDate'), outputs('EndDate')), '.')[1], ':')[2])),\r\n       '}'\r\n     )\r\n   ),\r\n   json(\r\n     concat(\r\n       '{\"Days\":0',\r\n       ',\"Hours\":', string(int(split(dateDifference(outputs('StartDate'), outputs('EndDate')), ':')[0])),\r\n       ',\"Minutes\":', string(int(split(dateDifference(outputs('StartDate'), outputs('EndDate')), ':')[1])),\r\n       ',\"Seconds\":', string(int(split(dateDifference(outputs('StartDate'), outputs('EndDate')), ':')[2])),\r\n       '}'\r\n     )\r\n   )\r\n)","runAfter":{"EndDate":["Succeeded"]},"metadata":{"operationMetadataId":"03c8d578-576a-41a3-8d63-609a15ce594b"}}},"runAfter":{"Add_to_time":["Succeeded"]}},"allConnectionData":{},"staticResults":{},"isScopeNode":true,"mslaNode":true}

Step 2: In Power Automate, click the + to add an action, then click Paste an action.

Conclusion

That's it! Pretty easy, right? If you encounter any issues, comment below!

Get the difference between two dates EASY

We have all been there: we need to check the difference between two dates, and implementing this used to mean some crazy mathematical equations using the ticks() expression. But not anymore.

I'm not sure when this expression got added, but we can now use the dateDifference() expression instead of ticks().

The dateDifference() expression is a powerful tool in Power Automate and Logic Apps for calculating the difference between two dates.

It lets you easily determine the number of days, hours, minutes, and seconds between two dates, which can be useful in a variety of scenarios.

  1. Syntax and Parameters
  2. How to Use
  3. Extracting the Result
    1. Extracting Days
    2. Extracting Hours
    3. Extracting Minutes
    4. Extracting Seconds
  4. Things to Know
  5. Links

Syntax and Parameters

The syntax is easy with only 2 parameters:

dateDifference('<startDate>', '<endDate>')

How to Use

Below is a simple example of how to use this expression:

dateDifference('2015-02-08T10:30:00', '2018-07-30T14:45:30')

This returns

"1268.04:15:30"

The result is in the format of:
Days.Hours:Minutes:Seconds

Note: If the dates passed in have no time component, the result shows zeros for the hours, minutes, and seconds. We can extract the different parts of the result by using some expressions inside a Compose action, which we will do next.

Extracting the Result

If you need to extract certain parts of the result, such as the days, hours, minutes, or even seconds, you can use the split() expression.
Below you will find an explanation of the extraction, as well as the exact expressions to use.

  • The split() function splits the output of dateDifference() at the period (‘.’) into an array with two elements: days and the rest (hours:minutes:seconds).
  • The [0] indexer retrieves the first element of the array, which represents the number of days.
  • The int() function converts the days from a string to an integer.
  • Replace the date-time values with your own dates/times.

Extracting Days

To extract the days from the result we can use

int(split(dateDifference('2015-02-08T10:30:00', '2018-07-30T14:45:30'), '.')[0])

This returns:

1268

Extracting Hours

To extract the hours interval from the result we can use

int(split(split(dateDifference('2015-02-08T10:30:00', '2018-07-30T14:45:30'), '.')[1], ':')[0])

This returns:

4

Extracting Minutes

To extract the minutes interval from the result we can use

int(split(split(dateDifference('2015-02-08T10:30:00', '2018-07-30T14:45:30'), '.')[1], ':')[1])

This returns:

15

Extracting Seconds

To extract the seconds interval from the result we can use

int(split(split(dateDifference('2015-02-08T10:30:00', '2018-07-30T14:45:30'), '.')[1], ':')[2])

This returns:

30

Things to Know

There are a few things to be aware of:

  • Be aware of time zones; Power Automate uses UTC as the baseline for all time formats.
  • If you're pulling dates from SharePoint, be aware of what time zone your site is in.
  • You can convert the time zones by using expressions or by using actions. Read more about converting time zones here.

dateDifference – Reference guide for expression functions – Azure Logic Apps | Microsoft Learn

Tip For Testing Your Flows In Power Automate

Wouldn't it be nice if we could test our Flows without executing some of the actions, like sending emails or creating items in SharePoint or Dataverse?

Guess what, we can! And it's very easy to do. Check this out!

If you're like me, you test your Flows over and over again. This results in sending unwanted emails, creating items in SharePoint or Dataverse, and creating files on OneDrive or SharePoint.
Every time you test your Flow, these actions get executed and cause unwanted behavior while testing.


Scenario

For example, I have a Flow that creates a new row in Dataverse and then sends an email to the person who created the new row. That is fine, but what happens when we have other actions in our Flow that we want to test to make sure they are correct?
I may want to test the Flow multiple times if I am doing some data manipulation, but this will result in creating multiple unwanted rows (records) in Dataverse, as well as sending emails every time.

We can clean up the testing process easily.

How?

We can utilize a feature called Static Result.

First click the 3 dots on the action, and select Static Results.

Next we can configure the static results. For a simple example, click the radio button to enable it, then select your Status and the Status Code.

Click Done.

Now the action will have a yellow beaker, indicating that the action is using Static results.

Things to note:
– Static Results are in 'Preview', so they could change at any time
– Not all actions support them
– If the option is greyed out and you're certain the action supports it, save the Flow and reopen it

This is only the beginning: you can create a custom failed response, or any result you want. This can help with troubleshooting and testing certain scenarios.

REMEMBER!! Turn off static results when you want the actions to execute like normal.

Examples

Some examples on when to use static results:

  • Flow runs without sending emails
  • Flow runs without Approvals needed
  • Flow runs that need to test errors on certain actions
  • Flow runs testing different error codes (Advanced) + Custom error codes

Conclusion

I have used this feature for a while now, and noticed not many people know about it. It's so useful in many testing scenarios. Just remember to disable the static results once you're done testing!

If you have any questions or want to add anything please leave a comment and like this post! Thank you!

Power Apps Choosing Which Connections To Use Using Power Automate

Uploading data from Power Apps can be scary from a security standpoint, since the user needs access to the data source. Let's use child flows to get around this and use any connection we want.

You may have run into an issue when creating Power Apps that need to submit data to SharePoint, Dataverse, etc., but you did not want to give every user of the app access to those data sources.
The problem is that Power Apps uses the connections of the user running the app, meaning if the app writes to a SharePoint list, the user needs Read/Write access to it.
The same goes for Power Automate: if we send the data from Power Apps to Power Automate, the Flow still runs with the connection of the user who triggered it.
How can we get around this? Read below!


Known Issues

  1. If you block the HTTP Request connector via data loss prevention (DLP), child flows are also blocked because child flows are implemented using the HTTP connector. Work is underway to separate DLP enforcement for child flows so that they are treated like other cloud flows.
  2. You must create the parent flow and all child flows directly in the same solution. If you import a flow into a solution, you will get unexpected results.
    Call Child Flows – Power Automate | Microsoft Docs

Prerequisites

  1. The Flows must be created inside the same Solution, so a Dataverse database must be configured on the Power Platform Environment

The Scenario

In this scenario, I will be showing how a user can use Power Apps to create items in a SharePoint List without being a member of the Site. This will allow us to use a specific Service Account to create the data in SharePoint without giving the user in the app any permission at all!

First we will build the Child Flow, then Parent Flow, and lastly customize the Power App

Child Flow

Inside your Solution create a new Cloud Flow.

  1. For our trigger we use a Manual Button, and add inputs for the data we are expecting from Power Apps to put inside our SharePoint List
    (In my example I am only bringing in one field for Title)
  2. Next, I add a Create Item action for my SharePoint List, and add the Parameters from the trigger inside the action.
  3. Lastly, I add a 'Respond to a PowerApp or flow' action, create an output called Success, and include some details about what was created.
Child Flow


Make sure to use the Connection you want users of the App to use for the SharePoint Create item action.

Save and go back to the Flow dashboard screen (where you see the Details and run history screen).

There will be a card on the right side called 'Run only users'; click Edit.

Run only users

Under Connections Used, switch ‘Provided by run-only user’ to the connection you want to be used by users of the App
(They won't have access to this connection outside this Flow)

Run only user

Click Save.

Now onto the Parent Flow

Parent Flow

Go back to the Solution and Create another Cloud Flow.

  1. For our trigger we use the PowerApps button trigger.
  2. As a best practice, create variables for the data coming from Power Apps. Don't forget to name them, as these will be the parameter names in Power Apps.
    Use the 'Ask in PowerApps' dynamic content for your variable values.
  3. Next we use an action called 'Run a Child Flow'
    (If you do not see this action, your Flow was not created inside a Solution)
    Add the parameters (these were the input parameters from the last Flow that we just created).
  4. Lastly, add a 'Respond to a PowerApp or flow' action. For this demo I am adding the parameter 'Success', which comes from the child Flow.


Click Save.

Power App

Now onto the Power App. I am going to create a simple Power App with one text input for Title and a button to pass the data to Power Automate.
Here are my controls for reference:

TextInput_Title
Button_SendToFlow

For the Button:
1. Add the Flow to the button by clicking on the Button
2. Click the Action tab at the top of the page
3. Click Power Automate
4. Select the Flow


Next, add the parameters for the Flow; in my case I am passing TextInput_Title.Text.

Now, I want to add a notification that the item has been added, which will confirm my Flow has run correctly. I'll be using the 'Success' output parameter from the Flow for this.

To add this, I store the Flow run in a variable inside Power Apps. I'll call my variable Results, and I add this to the OnSelect property of the button where my Flow is. Then I use the 'Notify' function to notify the user that the item was created, adding it after the semicolon.

So my final code looks like this:

Set(
    Results,
    'PA-Trigger1'.Run(TextInput_Title.Text)
);
Notify(
    Results.success,
    NotificationType.Success
);
Reset(TextInput_Title)

Now let's test it!

Conclusion

I am using a user called 'Demo User'. I have shared the app with this user, but they are not a member of the SharePoint site.


Here is the SharePoint Site:

Now Logged in as the Demo User to test this:

Logged in as Demo User


Button Clicked >

Button Pressed, Flow Completed

Now to check SharePoint >

Test Success!!

Done!
So this was just a basic example of how we can create data inside a data source that the user of the app does not need access to.

Check Conditions In Power Automate During Run

Is your IF condition always evaluating to False? Debugging and testing your Flows should be easy, but when using a Condition in Power Automate, we cannot see in the run the expression or the values that were evaluated.

I will go over a quick workaround to debug and find out what is happening inside the condition.

The Problem?

Is your Condition not working as expected?
The problem is when we use a Condition action inside Power Automate, we cannot see the “equation” that is being evaluated when looking into the run.

This affects how we can troubleshoot; the following solution will show what is happening inside the Condition action during the run.

Scenario

In this scenario, I am checking:
If one value is greater than a second value

Now during a test run, I expect this condition to be true, but in my run it is always showing false and going into the If no branch.

The big problem, though, is that I cannot see what the values being evaluated look like. Take a look below:

Clicking on "Show raw inputs" is also not helpful...

Solution

So what is this quick and easy solution to see the condition results? A simple ‘Compose‘ action.

Let's take a look:
First add a Compose under your Condition

Next copy the values that are in the Condition to the Compose.
My Compose now looks like this:

Now make sure the Compose is above your Condition.
I am just dragging the Condition below the Compose
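If you prefer an expression over copying the dynamic content, the Compose can also concatenate both sides of the comparison into one readable output. A minimal sketch, assuming two hypothetical variables named varFirstValue and varSecondValue hold what the Condition compares (swap in your own dynamic content):

concat(
   'Left side: ', string(variables('varFirstValue')),
   ' | Right side: ', string(variables('varSecondValue'))
)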

Next, we can run the Flow again, and see what the Compose can tell us:

Yikes! We can see our 2 values that are being evaluated are both 15.
And 15 is not greater than 15. This is why it's returning false.

My Thoughts

In my opinion, this should already be visible inside the Condition action. To get this feature added to Power Automate, we can vote on it. Head over to the Community Forum and vote for this idea.

View details of Condition results in runs – Power Platform Community (microsoft.com)

The more votes, the better the chances of the Product team implementing this.
Thank you for reading, and have a great day!

Where Are My Flows When Building Power Virtual Agents In Teams

You added a Flow to a Power Virtual Agents bot in Teams, and now you want to edit that Flow. Where is it? Come check out the answer!

Overview

Building Power Virtual Agents (PVA) in Microsoft Teams is fast, easy, fun, and powerful, especially when we add Power Automate to the mix. A couple questions come up:

1. After the bot is built, how do we edit the Flows? Do we have to go into the PVA bot inside of Microsoft Teams?

2. Where are the Flows stored?

The Answer

The answer to the above questions can be simplified into one response.

All Flows built inside the Teams environment for PVA chatbots are stored in the Teams environment under the Default Solution.
Now.. How do we get there?

Navigate to the Power Automate Web Portal
Power Automate | Microsoft Power Platform

Sign in, select the environments menu in the top right, and choose the environment that corresponds to the name of the Team where you built the bot.
My Microsoft Team name is 'POC – Teams'

Next navigate to the Solutions tab on the left, and select 'Default Solution'


Once inside the ‘Default Solution‘ we can see many different types of artifacts. To narrow this list down:
On the top right of the page there is a dropdown with different types. Select 'Flow'

That’s it. Now we can see all the Flows inside this Teams Environment.

Want to learn how to get user info from Office 365 to use in Power Virtual Agents? Check out my blog on the flow you see above:
Get User Info

Limitations

There are some limitations:
– There is no way to import a Flow into this Environment

– When using the Save As feature, the Flow is saved outside of the Solution and thus cannot be used for your PVA bot in Teams

– When modifying the Flow's inputs and outputs, you will have to remove the Flow action inside PVA for it to properly refresh.

Conclusion

If you need help with anything Power Platform related, check out the community sites:

Power Virtual Agents Community – Power Platform Community (microsoft.com)

Power Automate community (microsoft.com)

Power Apps community (microsoft.com)

Home – Microsoft Power BI Community

Getting Specific Files And IDs In SharePoint Using Power Automate


Contents

This post will go over a common problem I have seen when using the SharePoint actions to get a specific file ID.
There are three parts to this post:
The Problem
The Solution
Conclusion

The Problem?

I encountered an issue when trying to filter a file by filename in a SharePoint document library.

When you need to get a specific SharePoint file ID, it can be troublesome to filter the files. For example, using a 'Get files' action we can see that some properties of the file are encased in {}, meaning that SharePoint is using a calculation on the document library to create these fields.

Attempting to use a filter query on any of the fields encased in {} returns an error. For example, when I try to filter for the file called 'Readthis.txt', I get a 'Bad Request' error:

The error message I receive is:

Column ‘Name’ does not exist. It may have been deleted by another user.
clientRequestId: 75462fg8-686d-48p1-5se7-e8l97jk84025
serviceRequestId: 75462fg8-686d-48p1-5se7-e8l97jk84025

I have read online that using Title as the column name is correct; although I do not get an error that way, the output is empty.

The Solution

Now the best part, the solution!

The way I filter the output of the 'Get files' action is with the 'Filter array' action. This action allows us to filter on the calculated columns that SharePoint is using, like {Name}, {Link}, {FilenameWithExtension}, etc.

To get set up, we want the 'Get files' action to pull in ALL files, so we don't want to have any filter at this stage.

Now add a 'Filter array' action and put the value dynamic content from 'Get files' in the From field. On the left side of the condition, select the dynamic content you want to match on; on the right side, put the value you want to filter for.
For example, I want to get the file 'Readthis.txt', so my 'Filter array' action looks like this:
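If you prefer typing the condition in the Filter array action's advanced mode instead of the basic editor, it would look roughly like this (a sketch that assumes you are matching the {FilenameWithExtension} property against the example file name):

@equals(item()?['{FilenameWithExtension}'], 'Readthis.txt')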

Now when running the Flow, the 'Filter array' action properly filters down to just the file we want:

Conclusion

I wrote this blog post based on a scenario I helped solve on the Power Automate Community Forum.

Hopefully someone else finds this information useful.

Thanks for reading!

Converting Time Zones Easily In Power Automate


Summary

Did you know that Power Automate has a Date Time action that can convert and format time zones in one step?
Why is this important? Power Automate natively uses UTC as its time zone, as do most SharePoint sites. Using an action can be easier than using expressions.

The Flow

In this example, we want to get the current time (which will be in UTC since we are using Power Automate) and convert it to local time with a specific format.

First we want to get the current time. We could use the utcNow() expression, but I will show how to use the Date Time actions instead.

The actions are under Date Time:

Add a Current time action; this action is the same as using the utcNow() expression.

Next add a Convert time zone action; this action is very useful as it has preloaded time zones and formats to choose from.

The inputs for this action are:
Base time: Use the output from the Current time action
Source time zone: Make sure to select Coordinated Universal Time
Destination time zone: Select your local time zone or the time zone you want
Format string: This dropdown has many ISO formats to choose from. If you want to have a custom format, simply click the drop down and select Enter custom value. See below for examples
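By the way, if you would rather do the whole conversion in a single expression instead of the Convert time zone action, the equivalent looks roughly like this (a sketch; 'Eastern Standard Time' is just an example destination, so swap in your own Windows time zone name and format string):

convertTimeZone(utcNow(), 'UTC', 'Eastern Standard Time', 'MMMM dd, yyyy hh:mm tt')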

Format Examples

If for some reason the format you want is not in the dropdown for formats, you can create a custom format as long as it follows ISO 8601 format. To add a custom format click Enter custom value in the dropdown

Some tips when creating a format for the date ‘2020-10-13‘ (October 13 2020)
yy = 20
yyyy = 2020

MM = 10
MMM = Oct
MMMM = October

dd = 13
ddd = Tue
dddd = Tuesday

Examples:

yyyy-MMM-ddd = 2020-Oct-Tue
yy/MMMM/dddd = 20/October/Tuesday
dddd, MMMM, yyyy = Tuesday, October, 2020
MMMM dd, yyyy = October 13, 2020
yyyy-MM-dd = 2020-10-13 (used for comparing dates)

To add time to your format use the following in your format:
(It is best practice to add the letter ‘T’ before using time formats)

h = hours (12-hour clock, no leading zero)
hh = hours (12-hour clock, with leading zero)
HH = hours (24-hour clock)
mm = minutes
ss = seconds
tt = Appends either AM or PM to time

Some examples are:
MMMM dd, yyyyThh:mm = October 13, 2020T12:51
MMMM/dd/yyyyTHH:mm:ss = October/13/2020T13:02:41
hh:mm:ss tt = 01:06:41 PM
h:mm:ss tt = 12:06:41 PM
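The same format strings also work with the formatDateTime() expression, if you only need to reformat a timestamp without changing its time zone (a quick sketch using the current UTC time):

formatDateTime(utcNow(), 'MMMM dd, yyyy hh:mm tt')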

Conclusion

Knowing these formats and what each letter code does, the possibilities are endless. You can create any type of custom date-time format easily.

As always if you have any questions, don’t hesitate to reach out.

Thank you for reading!