We have all been there: we need to check the difference between two dates, and until now, implementing this meant some crazy math using the ticks() expression. But not anymore.
I'm not sure when this expression was added, but we can now use the dateDifference() expression instead of ticks().
The dateDifference() expression is a powerful tool in Power Automate and Logic Apps for calculating the difference between two dates. It lets you easily determine the number of days, hours, minutes, and seconds between two dates, which is useful in a variety of scenarios.
The result is in the format: Days.Hours:Minutes:Seconds
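For example, assuming a start and end date passed as ISO 8601 strings (the dates here are just sample values):

```
dateDifference('2022-01-01T08:00:00Z', '2022-01-10T12:30:00Z')
```

This returns 9.04:30:00, i.e. 9 days, 4 hours, and 30 minutes.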
Note: If the dates passed in have no time component, the result shows zeros for the hours, minutes, and seconds. We can extract the different parts of the result using expressions inside a Compose action, which we will do next.
Extracting the Result
If you need to extract certain parts of the result, such as the days, hours, minutes, or even seconds, you can use the split() expression. Below you will find an explanation of the extraction, as well as the exact expressions to use.
The split() function splits the output of dateDifference() at the period (‘.’) into an array with two elements: days and the rest (hours:minutes:seconds).
The [0] indexer retrieves the first element of the array, which represents the number of days.
The int() function converts the days from a string to an integer.
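Putting it all together, here is a sketch of the extraction expressions, assuming the dateDifference() result sits in a Compose action named Compose (the action name is my assumption; swap in your own):

```
Days:    int(split(outputs('Compose'), '.')[0])
Hours:   int(split(split(outputs('Compose'), '.')[1], ':')[0])
Minutes: int(split(split(outputs('Compose'), '.')[1], ':')[1])
Seconds: int(split(split(outputs('Compose'), '.')[1], ':')[2])
```

The hours/minutes/seconds expressions first split at the period to isolate the time portion, then split that at the colons.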
Wouldn't it be nice if we could test our Flows without executing some of the actions, like sending emails or creating items in SharePoint or Dataverse?
Guess what, we can! And it's very easy to do. Check this out!
If you're like me, you test your Flows over and over again. This results in sending unwanted emails, creating items in SharePoint or Dataverse, and creating files on OneDrive or SharePoint. Every time you test your Flow, these actions get executed and cause unwanted behavior.
Scenario
For example, I have a Flow that creates a new row in Dataverse and then sends an email to the person who created the row. That is fine, but what happens when we have other actions in our Flow that we want to test to make sure they are correct? I may want to test the Flow multiple times if I am doing some data manipulation, but this results in creating multiple unwanted rows (records) in Dataverse, as well as sending emails every time.
We can clean up the testing process easily.
How?
We can utilize a feature called Static Result.
First click the 3 dots on the action, and select Static Results.
Next we can configure the static results. For a simple example, click the radio button to enable it, then select your Status and Status Code.
Click Done.
Now the action will have a yellow beaker, indicating that the action is using Static results.
Things to note:
– Static Results are in 'Preview', so the feature could change at any time
– Not all actions are able to use them
– If the option is greyed out and you're certain the action supports it, save the Flow and reopen it
This is only the beginning, as you can create a custom failed response, or create any result you want. This can help troubleshooting and testing certain scenarios.
REMEMBER!! To turn off static results when you want to execute the actions like normal.
Examples
Some examples of when to use static results:
Flow runs without sending emails
Flow runs without Approvals needed
Flow runs that need to test errors on certain actions
I have used this feature for a while now and noticed not many people know about it. It's so useful in many testing scenarios. Just remember to disable the static results once you're done testing!
If you have any questions or want to add anything please leave a comment and like this post! Thank you!
Uploading data from Power Apps can be scary from a security standpoint, since the user needs access to the data source. Let's use child Flows to get around this and use any connection we want.
You may have run into an issue when creating a Power App that needs to submit data to SharePoint, Dataverse, etc., but you did not want to give everyone in the app access to these data sources. The problem is that Power Apps uses the connections of the user running the app: if the app writes to a SharePoint list, the user needs Read/Write access. The same goes for Power Automate; if we send the data from Power Apps to Power Automate, the Flow still uses the connection of the user who triggered it. How can we get around this? Read below!
If you block the HTTP Request connector via data loss prevention (DLP), child flows are also blocked because child flows are implemented using the HTTP connector. Work is underway to separate DLP enforcement for child flows so that they are treated like other cloud flows.
You must create the parent flow and all child flows directly in the same solution. If you import a flow into a solution, you will get unexpected results. Call Child Flows – Power Automate | Microsoft Docs
Prerequisites
The Flows must be created inside the same Solution, so a Dataverse database must be configured on the Power Platform Environment
The Scenario
In this scenario, I will be showing how a user can use Power Apps to create items in a SharePoint List without being a member of the Site. This will allow us to use a specific Service Account to create the data in SharePoint without giving the user in the app any permission at all!
First we will build the Child Flow, then Parent Flow, and lastly customize the Power App
Child Flow
Inside your Solution create a new Cloud Flow.
For our trigger, we use a manual button and add the data we are expecting from Power Apps to put inside our SharePoint list (in my example I am only bringing in one field, Title).
Next, I add a Create Item action for my SharePoint List, and add the Parameters from the trigger inside the action.
Lastly, I add a 'Respond to a PowerApp or flow' action, create an output called Success, and add some details about what was created.
Child Flow
Make sure to use the Connection you want users of the App to use for the SharePoint Create item action.
Save and go back to the Flow dashboard screen (where you see the Details and run history screen).
There will be a Card on the right side called ‘Run only users’ click Edit
Run only users
Under Connections Used, switch 'Provided by run-only user' to the connection you want users of the app to use (they won't have access to this connection outside this Flow).
Run only user
Click Save.
Now onto the parent Flow.
Parent Flow
Go back to the Solution and Create another Cloud Flow.
For our trigger we use the PowerApps button trigger.
As a best practice, create variables for the data coming from Power Apps. Don't forget to name them, as the names become the parameter names in Power Apps. Use the 'Ask in PowerApps' dynamic content for your variable values.
Next we use an action called 'Run a Child Flow' (if you do not see this action, your Flow was not created inside a solution). Add the parameters (these were the input parameters from the child Flow we just created).
Lastly, add a 'Respond to a PowerApp or flow' action. For this demo I am adding the parameter 'Success' from the child Flow.
Click Save.
Power App
Now onto the Power App. I am going to create a simple Power App with one text input for the Title, and a button to pass the data to Power Automate. Here are my controls for reference:
TextInput_Title
Button_SendToFlow
For the button:
1. Add the Flow to the button by clicking on the button
2. Click the Action tab at the top of the page
3. Click Power Automate
4. Select the Flow
Next add the parameters for the Flow, in my case I am adding the TextInput_Title.Text
Now, I want to add a notification that the item has been added, which will confirm my Flow has run correctly. I'll be using the 'Success' output parameter from the Flow for this.
To do this, I store the Flow's result in a variable inside Power Apps. I'll call my variable Results, and I add this to the OnSelect property of the button where my Flow is:
Now I use the 'Notify' function to notify the user that the item was created; I add this after the semicolon. My function looks like this in the end:
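A sketch of what the final OnSelect formula could look like, assuming the parent Flow is named ChildFlowDemo and its response includes the Success output (the Flow name and exact output field name are my assumptions):

```
// Run the Flow, store its response, then notify the user
Set(Results, ChildFlowDemo.Run(TextInput_Title.Text));
Notify("Item created: " & Results.success, NotificationType.Success)
```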
Is your If condition always evaluating to false? Debugging and testing your Flows should be easy, but when using a Condition in Power Automate, we cannot see the expression or the results of what is being evaluated in the run.
I will go over a quick workaround to debug and find out what is happening in the condition.
The Problem?
Is your Condition not working as expected? The problem is that when we use a Condition action inside Power Automate, we cannot see the "equation" being evaluated when looking into the run.
This affects how we can troubleshoot; the following solution shows what is happening inside the Condition action during the run.
Scenario
In this scenario, I am checking if one value is greater than a second value.
Now during a test run, I expect to see this condition true, but in my run it is always showing false and going in the If no branch.
The big problem is though, I cannot see what the values being evaluated look like. Take a look below
Clicking on "Show raw inputs" is also not helpful.
Solution
So what is this quick and easy solution to see the condition results? A simple ‘Compose‘ action.
Let's take a look. First, add a Compose under your Condition.
Next, copy the values that are in the Condition into the Compose. My Compose now looks like this:
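As a sketch, assuming the Condition compares two integer variables named Number1 and Number2 (the variable names are my assumption), the Compose inputs could be:

```
First value:  @{variables('Number1')}
Second value: @{variables('Number2')}
```

When the Flow runs, the Compose output shows exactly what each side of the Condition resolved to.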
Now make sure the Compose is above your Condition; I just drag the Condition below the Compose.
Next, we can run the Flow again, and see what the Compose can tell us:
Yikes! We can see the two values being evaluated are both 15, and 15 is not greater than 15. This is why it's returning false.
My Thoughts
In my opinion, this should be already visible inside the Condition action. To get this feature added to Power Automate, we can vote on this feature. Head over to the Community Forum and vote for this idea.
You added a Flow in Power Virtual Agents in Teams, you now want to edit that Flow. Where is it? Come check out the answer!
Overview
Building Power Virtual Agents (PVA) bots in Microsoft Teams is fast, easy, fun, and powerful, especially when we add Power Automate to the mix. A couple of questions come up:
1. After the bot is built, how do we edit the Flows? Do we have to go into the PVA bot inside of Microsoft Teams?
2. Where are the Flows stored?
The Answer
The answer to the above questions can be simplified into one response.
All Flows built inside the Teams environment for PVA chatbots are stored in the Teams environment under the Default Solution. Now, how do we get there?
Sign in to Power Automate, select the environments menu in the top right, and choose the environment that correlates to the Teams name where you built the bot. My Microsoft Team name is 'POC – Teams'.
Next navigate to the Solutions tab on the left, and select ‘Default Solution‘
Once inside the 'Default Solution' we can see many different types of artifacts. To narrow the list down, use the Type dropdown on the top right of the page and select 'Flow'.
That’s it. Now we can see all the Flows inside this Teams Environment.
Want to learn how to get user info from Office365 to use in Power Virtual Agents? Check out my blog on the flow you see above Get User Info
Limitations
There are some limitations:
– There is no way to import a Flow into this environment
– When using the Save As feature, the Flow is saved outside of the solution and thus cannot be used for your PVA bot in Teams
– When modifying the Flow's inputs and outputs, you will have to remove and re-add the Flow action inside of PVA to properly refresh it
Conclusion
If you need help with anything Power Platform related, check out the community sites.
I encountered an issue when trying to filter a file by filename in a SharePoint document library.
When needing to get a specific SharePoint file ID, it can be troublesome to filter the files. For example, using a 'Get files' action, we can see that the properties of the file are encased inside {}, meaning that SharePoint is using some calculation on the document library to create these fields.
Contents
This post will go over a common problem I have seen using the SharePoint action to get a specific file ID. There are three parts to this post:
The Problem
The Solution
Conclusion
The Problem?
Attempting to use a filter query on any of the fields encased in {} returns an error. For example, when I try to filter for the file called 'Readthis.txt', I get a 'Bad Request' error:
The error message I receive is:
Column ‘Name’ does not exist. It may have been deleted by another user. clientRequestId: 75462fg8-686d-48p1-5se7-e8l97jk84025 serviceRequestId: 75462fg8-686d-48p1-5se7-e8l97jk84025
I have read online that using Title as the column name is correct; although I do not get an error with it, the output is empty.
The Solution
Now the best part, the solution!
The way I filter the 'Get files' output is with the 'Filter array' action. This action allows us to filter on the calculated columns that SharePoint is using, like {Name}, {Link}, {FilenameWithExtension}, etc.
To get setup, we want the ‘Get files‘ action to pull in ALL files, so we don’t want to have any filter at this stage.
Now add a 'Filter array' action and put the dynamic content value in the From field. On the left side, select the dynamic content from the 'Get files' action; on the right side, put the value you want to filter on. For example, I want to filter for the file 'Readthis.txt', so my 'Filter array' action looks like this:
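As a sketch, in the Filter array's advanced mode the filter could look like this (I am assuming you match on the 'Name with extension' property; the internal field name is my assumption):

```
@equals(item()?['{FilenameWithExtension}'], 'Readthis.txt')
```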
Now when running the Flow, the ‘Filter array’ action is properly filtering out the filename:
Did you know that Power Automate has Date Time actions that can easily convert and format time zones in one action?
Why is this important? Power Automate natively uses UTC as its time zone, as do most SharePoint sites. Using an action can be easier than using expressions.
The Flow
In this example, we will get the current time (this will be in UTC since we are using Power Automate) and convert it to local time with a specific format.
First we want to get the current time. We could use the expression utcNow(), but I will show how to use the Date Time actions instead.
The actions are under Date Time:
Add a Current time action; this action is the same as using the utcNow() expression.
Next add a Convert time zone action. This action is very useful, as it has preloaded time zones and formats to choose from.
The inputs for this action are:
Base time: Use the output from the Current time action
Source time zone: Make sure to select Coordinated Universal Time
Destination time zone: Select your local time zone or the time zone you want
Format string: This dropdown has many standard formats to choose from. If you want a custom format, simply click the dropdown and select Enter custom value. See below for examples
Format Examples
If for some reason the format you want is not in the dropdown, you can create a custom format using the standard .NET date and time format specifiers. To add a custom format, click Enter custom value in the dropdown.
Some tips when creating a format for the date '2020-10-13' (October 13, 2020):
yy = 20
yyyy = 2020
MM = 10
dd = 13
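For example, assuming the converted time works out to 2:30 PM on that date (the time itself is just a sample), a custom format string of:

```
yyyy-MM-dd hh:mm tt
```

would produce 2020-10-13 02:30 PM.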
Deploying Power Automate Flows can be a headache if you have to manually change values inside the Flow for each environment. You also run the risk of missing a value.
This post will go in depth on using environment variables inside solutions to allow certain values to be used in different environments. I will be using two (2) environments for this demo: Dev and Test.
This demo utilizes the 'JSON' data type, which will save loads of time.
Terms / Glossary
Here are some terms that will help in this post:
Default Value = The value you're expecting after you deploy the Flow. This will be our Test environment values
Current Value = The value that overrides the Default Value. This will be our Dev environment values
Parameters = These are just values. For example, in 2 + 2, each 2 is a parameter of the addition expression (+)
** Note: I have created this guide to be as simple as possible. If you name everything as I do, you will be able to copy and paste all my expressions directly into your Flow **
The Scenario
My Flow posts a message into Microsoft Teams group. When deploying the Flow into different environments, I want to change the Teams content like:
Team Name
Channel ID
Subject
Usually after deploying, we can go into the Flow and manually change the values. This can cause errors, especially if we forget to change some values after deploying (I may have done this a few times).
Getting Parameter Values
It is important to note that not all action values can be parameterized. Follow the steps below to see if your values can be:
Teams example: in my Teams action, I want to parameterize the Team Name, Channel ID, and Subject. I add the action and select the values as needed.
Now, with the information filled in, click the 3 dots '. . .' on the action and click 'Peek code'.
Dev Parameters
In the ‘Peek code’ view, we can see the different parameters and the values that this action uses in the background. Copy these values into notepad or code editor for use later. I have created a simple JSON to be placed in my Environment Variable later on.
I will be using the Env value for my Subject in the teams message
For example:
Team = 1861402b-5d49-4850-a54b-5eda568e7e8b
Channel = 19:be3759762df64d94a509938aa9962b29@thread.tacv2
Subject = Message From Dev Environment
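As a sketch, the Dev JSON for the environment variable could look like this (the key names are my assumption; just keep them identical across both environments' JSONs):

```
{
  "Team": "1861402b-5d49-4850-a54b-5eda568e7e8b",
  "Channel": "19:be3759762df64d94a509938aa9962b29@thread.tacv2",
  "Subject": "Message From Dev Environment"
}
```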
To test that we can use custom values as parameters, grab the values from above and insert them into the 'Teams Post a message' action as custom values, then run the Flow.
Mine looks like this now:
Now run the Flow to make sure everything works as expected using the Custom values
Now that we know custom values work for the inputs/parameters of the action, we want to get the values for the other environment. Remove the custom values from the inputs and choose the correct values that we want to point to when this Flow is deployed to the other environment. For example:
Again, do a 'Peek code' to get the parameter IDs that this action uses.
Test Parameters
Copy these values into Notepad or a code editor for later. Now we have two sets of values: one for Dev and one for Test. I have created a simple JSON to be placed in my environment variable later on.
I will be using the Env value for my Subject in the teams message
Make sure the two (2) JSONs have the same structure. We will use this data to populate the environment variables in the following steps.
Creating Environment Variables
Environment variables can only be created inside a solution; there are multiple ways you or your organization may want this set up.
In this demo I have created a publisher in CDS called 'param'; this better identifies the parameters we create inside the solution (this is optional, and the default CDS publisher could also be used).
Create a solution inside Power Automate: click Solutions on the left navigation menu, then click 'New solution' on the top menu and fill out the information.
Once the solution is created, Click ‘New’ > Environment variable
Now fill in the information like the screenshot below.
Note: I will be using the JSON data type. I use this for simplicity, as I have more than one value to add; also, we can use the Parse JSON action in the Flow to reference each value separately. You can use any data type you like.
Now we populate the values we want to use per environment, the way we do this is fill in the Default Value, and the Current Value.
Default Value = The values we want for the other environment, in this case Test
Current Value = The values we want to use for this environment, in this case Dev
Once the values are pasted in click ‘Save’
Creating The Flow
I will be creating the Flow inside the solution that the Environment Variable is in from above. Inside the solution click ‘New’ > Flow
For the demo I will use a manual trigger, then add two (2) 'Initialize variable' actions.
Initialize variable – Schema Name
Name: schemaName
Type: String
Value: The name of the environment variable (this can be found on the main screen of the solution, under the Name column)
Initialize variable – Parameters
Name: parameters
Type: String
Value: Leave blank for now; this variable will store either the Default Value or the Current Value, based on which environment we are in
Next add a ‘Scope’ action to hold all the actions that will get the parameters
I renamed my ‘Scope’ to Load Parameters.
NOTE: You can copy and paste my filter queries as long as you kept the same names I did. When you see the @ in Power Automate, it allows you to call expressions without having to go into the expression tab. If you want to build the expressions yourself, I include the syntax under each picture of the List records actions.
Inside the Scope, add two (2) 'Common Data Service Current Environment – List records' actions.
1) List records – Parameter Definitions
Entity name: Environment Variable Definitions
Filter Query: schemaname eq '@{variables('schemaName')}'
schemaname eq 'YOUR ENVIRONMENT VARIABLE NAME'
2) List records – Parameter Current Values
Entity name: Environment Variable Values
Filter Query: _environmentvariabledefinitionid_value eq '@{first(outputs('List_records_-_Parameter_Definitions')?['body/value'])?['environmentvariabledefinitionid']}'
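The next step is choosing which value to load into the parameters variable. As a sketch, assuming the action names above: add a 'Condition' that checks whether any Current Value records came back, for example `greater(length(outputs('List_records_-_Parameter_Current_Values')?['body/value']), 0)`. In the Yes branch, use 'Set variable' on parameters with:

```
first(outputs('List_records_-_Parameter_Current_Values')?['body/value'])?['value']
```

and in the No branch, fall back to the definition's default:

```
first(outputs('List_records_-_Parameter_Definitions')?['body/value'])?['defaultvalue']
```

The field names 'value' and 'defaultvalue' are my assumption based on the Dataverse environment variable tables; verify them against your own List records outputs.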
Under the 'If condition', add a 'Parse JSON' action:
Name: Parse JSON – Parameters
Content: @{variables('parameters')}
Schema: To generate your schema, click the 'Generate from sample' button, then paste in the JSON that you used for the Default Value
We are done with the parameter scope now.
Using The Parameters
I will add a Teams 'Post a message' action and use the dynamic content from my 'Parse JSON' action.
Triggering the Flow, I expect the Dev values to be used (Current Value)
Here is the Teams message:
Next we will export and import into a different Environment which will use different parameters (Default Value)
Export And Import – Deployment
Overview of this section's steps:
1. Remove Current Value from the environment variable
2. Publish the solution
3. Export the solution
4. Import the solution
5. Authorize any connections
6. Enable the Flow
7. Trigger / Test
Remove Current Value from Environment variable
Inside the solution, click on the environment variable; under Current Value, click the 3 dots ( . . . ) and select Remove from this solution.
This only removes the values from this solution. The Current Values will still be in CDS and can be added back into this solution if needed by clicking Add existing under Current Value.
Publish Solution
Inside your solution, click 'Publish all customizations'.
Export Solution
Once published click ‘Export’
Export as 'Managed' or 'Unmanaged', based on your needs.
Import Solution
Switch to your other Environment, and click Solutions tab. Click ‘Import’
Choose the Zip file from your export, and follow the instructions by clicking 'Next'.
Authorize any Connections
Once the solution is imported successfully, you may need to authorize any connections inside the Flow. This can be done by clicking on the Flow from inside your solution, clicking 'Sign in' on all actions, then clicking 'Save'.
Enable Flow
Now enable / turn on the Flow
Trigger / Test
Trigger the Flow to confirm the values are coming over as correct (Default Value).
Test Env – Using Default Value as expected
Now in Teams, our message has been posted in a different team, in a different channel, and with the 'Test' text in the subject line.
Remember Virus Total? Now you can integrate it with Power Automate for real-time URL and file analysis.
Virus Total in Power Automate: now we can scan links or files and generate a report right in Power Automate. Examples include links or files from emails, Teams, etc.
What is Virus Total
Virus Total is a free and powerful tool to scan files and links. Virus Total uses the hash of the file/URL and checks some of the most popular antivirus engines to generate a report. https://www.virustotal.com/
Prerequisites
This is a Premium connector. Note: at the time of this blog, these actions are in Preview.
Virus Total has two types of API:
Free (Public):
– The Public API is limited to 4 requests per minute and 1K requests per day
– The Public API must not be used in commercial products or services
– The Public API must not be used in business workflows that do not contribute new files
Paid (Premium):
– The Premium API does not have request rate or daily allowance limitations; limits are governed by your licensed service step
– The Premium API returns more threat data and exposes more endpoints and functionality
– The Premium API is governed by an SLA that guarantees readiness of data
Keep the above information in mind when using the API
To use the Virus Total connector, you must sign up on their site and get a token. To get the token, follow these steps:
First, head over to https://www.virustotal.com/ and sign up for free. Next, you will have to confirm your email address.
Once you can log in to your account, click your person logo in the top right and select API key.
Now copy the API key that you are given. That is it! You can now use that key to create a connection with the Virus Total connector in Power Automate.
Connection Setup
First, we make a connection to the Virus Total API. In your Flow, add a new action, search for Virus Total.
Virus Total has a couple of actions here that are very powerful. We will be using the 'Analyse an URL' action for this demo.
All that is needed to create the connection is your API key from the prerequisites.
Connection name can be anything you want, for this demo I chose VirusTotalDemo
Now that we have the connection established we can build the Logic for analyzing a URL.
Building the Flow
I will be using a button trigger and a variable to store the URL I want to analyze, but I will go through some use cases at the end of this blog of how this can be implemented.
There are two main actions in the Virus Total connector I will be using: – Analyse an URL – Retrieve information about a file or URL analysis
I am using a string variable to store the URL. Now we use the Virus Total action called 'Analyse an URL'. This action only needs one input: the URL we want to analyse. It outputs an 'id' for the analysis, which we can use in the next action.
Now we add the second Virus Total action called: Retrieve information about a file or URL analysis. This action wants the ‘id’ from the first step.
From here we get a bunch of cool dynamic content for the stats of this URL. For this demo, I will use the 'stats' dynamic content. This is a JSON object, so I will add a Parse JSON action. To get the schema, you can either copy my schema, or use {} as the schema, run the Flow, and then copy the outputs into 'Generate from sample'.
From parsing the 'stats' object, I can check how many engines reported the URL as harmless, malicious, suspicious, or undetected. This information can be very useful.
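For reference, a Parse JSON schema along these lines should work for the 'stats' object (the 'timeout' property is my assumption from the VirusTotal v3 API; drop it if your output does not include it):

```
{
  "type": "object",
  "properties": {
    "harmless":   { "type": "integer" },
    "malicious":  { "type": "integer" },
    "suspicious": { "type": "integer" },
    "undetected": { "type": "integer" },
    "timeout":    { "type": "integer" }
  }
}
```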
Now I can add an If condition and control what kind of sites I want to classify as harmful or malicious. Here is my condition:
If 3 or more report engines flag the URL as malicious, or the report has fewer than 50 harmless results, then I classify this URL as BAD.
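As an expression sketch, assuming the Parse JSON action is named 'Parse JSON' (my assumption), that condition could be written in advanced mode as:

```
@or(
  greaterOrEquals(body('Parse_JSON')?['malicious'], 3),
  less(body('Parse_JSON')?['harmless'], 50)
)
```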
Use Cases / Conclusion
I have only scratched the surface with Virus Total in this blog; I am sure many people can find great uses for it. A great example of how this can be used: have users send files or URLs to be analyzed, to help with cyber security. This could trigger via:
– a Flow bot in Teams
– a Flow monitoring a certain email address, which can parse the body for URLs and check for attachments
Expressions can be confusing when starting out in Power Automate. Luckily, the product team makes things easier every day. I will show how to take the email address Josh.Cook@flowaltdelete.ca and transform it to Josh Cook.
The Scenario
For this demo, we will format an email address, removing the '@' symbol and everything after it, to form a first name and last name.
We will use Substring to achieve this. However, we won't be using the substring() expression; we will use an action called Substring, which can be found in the 'Text Functions' connector.
The Flow
In my Flow I will use a Compose action to store the email address; this could be any data source or action that fits your needs.
In this example, we want to remove the '@' sign and everything after it. We could use an expression to do this, but the Power Automate team has put together some actions that make text functions easier.
At this time there are two (2) Text Function actions we can utilize: 1. Find text position 2. Substring. We will use both in this example.
First we add the 'Find text position' action. This action has two (2) parameters to fill in:
Text – The text we want to use; in this case, the dynamic content of our Compose with the email
Search text – The text we want to find the position of
In the string Josh.Cook@flowaltdelete.ca, the '@' sign is at position 9, because indexing starts at 0.
Next we add the 'Substring' action. This action has three (3) parameters:
1. Text – The string of text to extract the substring from
2. Starting position – Since we want to extract 'Josh.Cook', our position is 0
3. Length – How long the substring will be; this is where we use the dynamic value from the Find text position action (which is 9)
Now when we run the Flow, we take the value 'Josh.Cook@flowaltdelete.ca' and output 'Josh.Cook'.
Not bad! Now all that is needed is getting rid of the '.'. This can easily be done with the replace() expression: replace(<Dynamic Substring Body>, '.', ' '). The replace expression replaces the '.' with a space character:
replace(outputs('Substring')?['body'],'.',' ')
Now when we run the flow:
Conclusion
These new Text Function actions in Power Automate make expressions even easier. I cannot wait to see what the product group adds next. Thanks for reading, and as always, please reach out to me on Twitter with any questions you may have. Thanks again!