Category: Power Platform

  • How to Use Regular Expressions in Microsoft Power Virtual Agents With Examples

    How to Use Regular Expressions in Microsoft Power Virtual Agents With Examples

    Regular Expressions in Power Virtual Agents? Sounds like a pretty advanced topic. But it’s actually not that difficult and can save you hours of time if you’re trying to validate user input for things such as credit card numbers, tracking IDs, custom invoice numbers or even IP addresses. In this post we’ll cover some of the basics of Regular Expression syntax so you can get started using them inside Power Virtual Agents.

    Summary

    To utilize regular expressions inside Power Virtual Agents, we must first create a new entity.
This can be done by clicking the Entities tab > New entity.

    Now select Regular expression (Regex)

PVA does a great job of providing some general use case examples.

The syntax is based on .NET regular expressions.


    RegEx Examples in PVA

    Below you will find some examples you can copy and paste directly into the Pattern for your Regular Expression:

IP Address
^(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?).){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)$
– Looks for X.X.X.X format
– Each X ranges from 0-255
– Each X is 1-3 digits

Visa Credit Card Numbers
^4[0-9]{12}(?:[0-9]{3})?$
– Starts with a 4
– Old cards use 13 digits
– New cards use 16 digits

American Express
^3[47][0-9]{13}$
– Starts with 34 OR 37
– All have 15 digits

Mastercard
^(?:5[1-5][0-9]{2}|222[1-9]|22[3-9][0-9]|2[3-6][0-9]{2}|27[01][0-9]|2720)[0-9]{12}$
– Starts with either 51-55 OR 2221-2720
– All have 16 digits

Social Security Number
^(?!0{3})(?!6{3})[0-8]\d{2}-(?!0{2})\d{2}-(?!0{4})\d{4}$
– SSNs are 9 digits
– Looks for XXX-XX-XXXX format
– No group can be all zeros
– Cannot begin with 666 OR 900-999

MAC Address
^[a-fA-F0-9]{2}(:[a-fA-F0-9]{2}){5}$
– 6 hex bytes separated by a colon “:” (swap the “:” for “-” to match dash-separated addresses)

Port Number
^((6553[0-5])|(655[0-2][0-9])|(65[0-4][0-9]{2})|(6[0-4][0-9]{3})|([1-5][0-9]{4})|([0-5]{0,5})|([0-9]{1,4}))$
– Matches a valid port number in a computer network
– 16 bit
– Ranges from 0-65535

Jira Ticket Number
[A-Z]{2,}-\d+
– Looks for a hyphen-separated Jira project key and issue number

Bitcoin Address
^(bc1|[13])[a-zA-HJ-NP-Z0-9]{25,39}$
– 26-35 alphanumeric characters
– Starts with 1 OR 3 OR bc1

UUID / GUID
^[0-9a-fA-F]{8}\b-[0-9a-fA-F]{4}\b-[0-9a-fA-F]{4}\b-[0-9a-fA-F]{4}\b-[0-9a-fA-F]{12}$
– 36 characters
– 128 bit, represented as 16 octets
– Looks for the 8-4-4-4-12 format
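If you want to sanity-check any of these patterns before pasting them into PVA, you can exercise them in any regex tool. A quick sketch in Python (whose re engine happens to accept these particular patterns, including the lookaheads), using well-known test card numbers:

```python
import re

# Patterns copied from the table above; they are written for .NET but these
# particular ones are also valid in Python's re module.
patterns = {
    "visa": r"^4[0-9]{12}(?:[0-9]{3})?$",
    "amex": r"^3[47][0-9]{13}$",
    "ssn":  r"^(?!0{3})(?!6{3})[0-8]\d{2}-(?!0{2})\d{2}-(?!0{4})\d{4}$",
}

def matches(name: str, value: str) -> bool:
    """Return True if the value fully matches the named pattern."""
    return re.fullmatch(patterns[name], value) is not None

print(matches("visa", "4111111111111111"))  # True  (16-digit test number)
print(matches("amex", "378282246310005"))   # True  (15 digits, starts with 37)
print(matches("ssn", "666-12-3456"))        # False (cannot begin with 666)
```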

    Using them in PVA

Once we create the entity and define the pattern for our RegEx, we can use this validation inside our PVA chat.

    For example, I will test the IP Address pattern

    ^(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?).){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)$
    

    I have created a topic for testing my RegEx.

To use the newly created entity, add a Question, and under Identify select your Custom Entity.

Under this, I add a message to confirm it's valid.
(Note: the bot will automatically let the user know if the input does not match the pattern)

    Testing the RegEx

    Okay, drumroll….
    The values I will be testing are

User Input          Valid?
192.168.1.1         Valid ✔
127.0.0.1           Valid ✔
999.55.1.5          Not Valid ✖
Not A IP Address    Not Valid ✖
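The same check can be reproduced outside the bot; here is a small Python sketch running the table's four inputs against the IP pattern above:

```python
import re

# Pattern copied verbatim from the post (note the unescaped "." separator)
IP_PATTERN = r"^(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?).){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)$"

def is_valid_ip(value: str) -> bool:
    return re.fullmatch(IP_PATTERN, value) is not None

for value in ["192.168.1.1", "127.0.0.1", "999.55.1.5", "Not A IP Address"]:
    print(value, "->", is_valid_ip(value))
```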

    Conclusion

Being able to use Regular Expressions inside Power Virtual Agents is extremely powerful, and I hope the list of common patterns above helps you get started.

    Thank you, and have a great day!

  • Power Apps Choosing Which Connections To Use Using Power Automate

    Power Apps Choosing Which Connections To Use Using Power Automate

You may have run into an issue when creating a Power App that needs to submit data to SharePoint, Dataverse, etc., but you did not want to give everyone using the app access to these data sources.
The problem is that Power Apps uses the connections of the user running the app, meaning if the app writes to a SharePoint list, the user will need Read/Write access.
The same goes for Power Automate: if we send the data from Power Apps to Power Automate, the Flow still uses the connection of the user who triggered it.
    How can we get around this? Read below!

    Known Issues

    1. If you block the HTTP Request connector via data loss prevention (DLP), child flows are also blocked because child flows are implemented using the HTTP connector. Work is underway to separate DLP enforcement for child flows so that they are treated like other cloud flows.
    2. You must create the parent flow and all child flows directly in the same solution. If you import a flow into a solution, you will get unexpected results.
      Call Child Flows – Power Automate | Microsoft Docs

    Prerequisites

    1. The Flows must be created inside the same Solution, so a Dataverse database must be configured on the Power Platform Environment

    The Scenario

    In this scenario, I will be showing how a user can use Power Apps to create items in a SharePoint List without being a member of the Site. This will allow us to use a specific Service Account to create the data in SharePoint without giving the user in the app any permission at all!

    First we will build the Child Flow, then Parent Flow, and lastly customize the Power App

    Child Flow

    Inside your Solution create a new Cloud Flow.

    1. For our trigger we use a Manual Button, and add the data we are expecting from Power Apps to put inside our SharePoint List
      (In my example I am only bringing in one field for Title)
    2. Next, I add a Create Item action for my SharePoint List, and add the Parameters from the trigger inside the action.
3. Lastly, I add a ‘Respond to a PowerApp or flow’ action; I create an Output called Success, with some details about what was created.
    Child Flow


    Make sure to use the Connection you want users of the App to use for the SharePoint Create item action.

    Save and go back to the Flow dashboard screen (where you see the Details and run history screen).

There will be a card on the right side called ‘Run only users’. Click Edit.

    Run only users

Under Connections Used, switch ‘Provided by run-only user’ to the connection you want users of the App to use.
(They won't have access to this connection outside this Flow)

    Run only user

Click Save.

    Now onto the Parent Flow

    Parent Flow

    Go back to the Solution and Create another Cloud Flow.

    1. For our trigger we use the PowerApps button trigger.
2. As a best practice, create variables for the data coming from Power Apps. Don’t forget to name them, as these names become the parameter names in Power Apps.
  Use the ‘Ask in PowerApps‘ dynamic content for your variable values.
3. Next we use an action called ‘Run a Child Flow’
  (If you do not see this action, your Flow was not created inside a Solution)
  Add the parameters (these were the input parameters from the child Flow we just created).
4. Lastly, add the ‘Respond to a PowerApp or flow’ action. For this demo I am adding the parameter ‘Success’, which comes from the child Flow.


    Click Save.
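Conceptually, the parent/child contract looks like this. A minimal Python sketch where the function and output names mirror this demo (the real work happens in the Flow actions, of course):

```python
# Illustrative sketch only: the names child_flow/parent_flow/'Success' mirror
# the Flows built in this post; they are not a real Power Automate API.

def child_flow(title: str) -> dict:
    # Runs under the fixed service-account connection and creates the
    # SharePoint item; returns the 'Success' output defined above.
    return {"Success": f"Created item '{title}'"}

def parent_flow(title: str) -> dict:
    # The parent forwards the Power Apps input to the child and relays
    # the child's 'Success' output back to the app.
    result = child_flow(title)
    return {"Success": result["Success"]}

print(parent_flow("Demo Title")["Success"])  # Created item 'Demo Title'
```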

    Power App

Now onto the Power App. I am going to create a simple Power App with 1 TextInput for Title, and a Button to pass the data to Power Automate.
    Here are my controls for reference:

    TextInput_Title
    Button_SendToFlow

    For the Button:
    1. Add the Flow to the button by clicking on the Button,
    2. Clicking Action tab on top of page,
    3. Clicking Power Automate
    4. Select the Flow


    Next add the parameters for the Flow, in my case I am adding the TextInput_Title.Text

Now, I want to add a notification that the item has been added, which will confirm my Flow has run correctly. I'll be using the ‘Success’ output parameter from the Flow for this.

To add this, I store the Flow run result in a variable inside Power Apps. I'll call my variable Results, and I add this to the OnSelect property of the Button where my Flow is:

    Now I use the ‘Notify’ function to notify the user of the item being created, I add this after the semicolon. So my function looks like this in the end:


    So my final code looks like this:

    Set(
        Results,
        'PA-Trigger1'.Run(TextInput_Title.Text)
    );
    Notify(
        Results.success,
        NotificationType.Success
    );
    Reset(TextInput_Title)
    

Now let's test it!

    Conclusion

I am using a user called ‘Demo User’. I have shared the App with this user, but they are not a member of the SharePoint Site.


    Here is the SharePoint Site:

    Now Logged in as the Demo User to test this:

    Logged in as Demo User


    Button Clicked >

    Button Pressed, Flow Completed

    Now to check SharePoint >

    Test Success!!

    Done!
So this was just a basic example of how we can create data inside a data source that the user of the App does not need access to.

  • Check Conditions In Power Automate During Run

    Check Conditions In Power Automate During Run

    The Problem?

    Is your Condition not working as expected?
The problem is that when we use a Condition action inside Power Automate, we cannot see the “equation” being evaluated when looking into the run.

This makes troubleshooting difficult. The following solution shows what is happening inside the Condition action during the run.

    Scenario

    In this scenario, I am checking:
    If one value is greater than a second value

Now during a test run, I expect this condition to be true, but in my run it is always showing false and going into the ‘If no’ branch.

The big problem, though, is that I cannot see what the values being evaluated look like. Take a look below:

Clicking on “Show raw inputs” is also not helpful.

    Solution

    So what is this quick and easy solution to see the condition results? A simple ‘Compose‘ action.

Let's take a look:
    First add a Compose under your Condition

    Next copy the values that are in the Condition to the Compose.
    My Compose now looks like this:

    Now make sure the Compose is above your Condition.
    I am just dragging the Condition below the Compose

    Next, we can run the Flow again, and see what the Compose can tell us:

Yikes! We can see that the two values being evaluated are both 15.
And 15 is not greater than 15. This is why it's returning false.
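The fix follows directly from the comparison semantics; a two-line sketch:

```python
# The same comparison the Condition action was evaluating
left = 15
right = 15
print(left > right)   # False: equal values fail a strict greater-than check
print(left >= right)  # True:  use "greater than or equal" if ties should pass
```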

    My Thoughts

    In my opinion, this should be already visible inside the Condition action. To get this feature added to Power Automate, we can vote on this feature. Head over to the Community Forum and vote for this idea.

    View details of Condition results in runs – Power Platform Community (microsoft.com)

    The more votes, the better the chances of the Product team implementing this.
    Thank you for reading, and have a great day!

  • Checking If HTML Table Is Empty In Power Automate

    Checking If HTML Table Is Empty In Power Automate

    The Problem

    I needed to check if an HTML table had data or not. Usually when I need to check I have two expressions I go to first.

    1. empty()
    2. length()

I tried using empty() and found that the HTML table, even when empty, is not truly empty.
I then tried length() and found that when the HTML table is empty, there is still a length of 30.

    The Scenario

I have some data that is used to track different devices that can be loaned out. The data has properties like Type of device, Serial Number, etc.

    The data comes in, and looks like this:

    [
      {
        "type": "Phone",
        "device": "iPhone 11 Pro",
        "serialNumber": "0007488"
      },
      {
        "type": "Phone",
        "device": "Samsung Galaxy S20",
        "serialNumber": "1166289"
      },
      {
        "type": "Watch",
        "device": "Apple Watch Series 5",
        "serialNumber": "00013701"
      },
      {
        "type": "Laptop/Tablet",
        "device": "Surface Pro X",
        "serialNumber": "AA78442"
      }
    ]

I want to put this array of data inside an HTML table and send it out in an email. The problem is my data might be empty, as only available devices show up in my data.

I need to check whether the HTML table has data:
If True:
Send email with the HTML table
If False:
Send email without the HTML table

    The Flow

    For this Flow, I will be using an Array Variable to simulate my data coming in from another system.
    I will call this Variable ‘Data‘.
    The HTML table action will be added underneath.
You will need to determine whether you want to use ‘Custom columns‘ or ‘Automatic columns‘. This can be done in the advanced options of the HTML table action:

My ‘Data‘ variable is empty at the moment. This is what we want for our first run: we want to get the length of the HTML table when it's empty.

    Next add a ‘Compose‘ action, and use the expression length(), pass in the HTML table as the parameter. For example, my expression looks like:

    length(body('Create_HTML_table'))
    

Now run the Flow with no data in the HTML table, and check your Compose action to see what the length is. In my case it is 30.

Now we can add an If Condition to check if the length is greater than 30.
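Put together, the check amounts to comparing the rendered table's length against a measured empty-table baseline. A small Python sketch, where the empty-table markup is only an assumed stand-in (measure your own baseline from a real run):

```python
# The baseline length of an empty table must be measured from a real run of
# your own Flow (30 in this post); it depends on your column setup.
EMPTY_TABLE_LENGTH = 30

def table_has_data(html_table: str) -> bool:
    """Mirror of the Condition: length(table) greater than the empty baseline."""
    return len(html_table) > EMPTY_TABLE_LENGTH

# Stand-in markup (assumption) that happens to be exactly 30 characters long
empty_table = "<table><thead></thead></table>"
full_table = "<table><thead></thead><tr><td>iPhone 11 Pro</td></tr></table>"

print(table_has_data(empty_table))  # False
print(table_has_data(full_table))   # True
```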

    ** TIP **
I am passing the Compose action's output into the condition; this lets me see the output of the Compose action before it gets evaluated inside the condition. This is extremely useful for troubleshooting.

    Conclusion

    The Flow will go into the ‘If yes’ block if the HTML table has data

    The Flow will go into the ‘If no’ block if the HTML table is empty

Of course, checking the length of the Data variable itself could work better. This example is mainly for data that can come in with loads of junk attached. For example:
An HTTP API could return no data but still include other information like headers, a status code, and a version. In that case we can only do conditional checks on the HTML table, since our Data variable will always have something passed in.

    I used this method to help someone on the Community Forum, check it out here:
    https://powerusers.microsoft.com/t5/Building-Flows/Create-a-Flow-with-Condition-that-does-not-send-email-when-list/m-p/721076/highlight/false#M98488

  • Getting Specific Files And IDs In SharePoint Using Power Automate

    Getting Specific Files And IDs In SharePoint Using Power Automate

    Contents

This post will go over a common problem I have seen when using SharePoint actions to get a specific file ID.
There are three parts to this post:
    The Problem
    The Solution
    Conclusion

    The Problem?

I encountered an issue when trying to filter a file by filename in a SharePoint document library.

When you need to get a specific SharePoint file ID, it can be troublesome to filter the files. For example, using a ‘Get files‘ action we can see that the properties of the file are encased inside {}, meaning that SharePoint is using some calculation on the Document Library to create these fields.

Attempting to use a filter query on any of the fields encased in {} returns an error. For example, when I try to filter on the file called ‘Readthis.txt‘ I get a ‘Bad Request’ error:

    The error message I receive is:

    Column ‘Name’ does not exist. It may have been deleted by another user.
    clientRequestId: 75462fg8-686d-48p1-5se7-e8l97jk84025
    serviceRequestId: 75462fg8-686d-48p1-5se7-e8l97jk84025

I have read online that using Title as the column name is correct; although I do not get an error with Title, the output is empty.

    The Solution

    Now the best part, the solution!

The way I filter the ‘Get files‘ results is by using the ‘Filter array‘ action. This action allows us to filter on the calculated columns that SharePoint is using, like {Name}, {Link}, {FilenameWithExtension}, etc.

To get set up, we want the ‘Get files‘ action to pull in ALL files, so we don't want any filter at this stage.

Now add a ‘Filter array‘ action and put the dynamic content value in the From field. On the left side select the dynamic content from the ‘Get files‘ action; on the right side put what you want to filter on.
So for example, I want to filter and get the file ‘Readthis.txt‘. My ‘Filter array‘ action looks like this:

    Now when running the Flow, the ‘Filter array’ action is properly filtering out the filename:
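The ‘Filter array‘ step is the Flow equivalent of a simple in-memory filter; a Python sketch with made-up file records:

```python
# Stand-in for the 'Get files' output: each file carries the computed
# columns SharePoint adds, such as {Name} (property names are illustrative).
files = [
    {"Name": "Readthis.txt", "ID": 4},
    {"Name": "Notes.docx", "ID": 7},
]

# Equivalent of the 'Filter array' action: Name is equal to 'Readthis.txt'
matched = [f for f in files if f["Name"] == "Readthis.txt"]
print(matched[0]["ID"])  # 4
```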

    Conclusion

I wrote this blog post based on a scenario I helped solve on the Power Automate Community Forum.

Hopefully someone else finds this information useful.

    Thanks for reading!

  • Converting Time Zones Easily In Power Automate

    Converting Time Zones Easily In Power Automate

    Summary

Did you know that Power Automate has Date Time actions that can easily convert and format time zones in one action?
Why is this important? Power Automate natively uses UTC as its time zone, as do most SharePoint sites. Using an action can be easier than using expressions.

    The Flow

In this example, we will get the current time (this will be in UTC since we are using Power Automate) and convert it to local time with a specific format.

    First we want to get the current time, we can use the expression utcNow() but I will be showing how to use the Date Time actions instead.

    The actions are under Date Time:

Add a Current time action; this action is the same as using the utcNow() expression.

Next add a Convert time zone action; this action is very useful as it has preloaded time zones and formats to choose from.

    The inputs for this action are:
    Base time: Use the output from the Current time action
    Source time zone: Make sure to select Coordinated Universal Time
    Destination time zone: Select your local time zone or the time zone you want
Format string: This dropdown has many standard formats to choose from. If you want a custom format, simply click the dropdown and select Enter custom value. See below for examples.

    Format Examples

If the format you want is not in the dropdown, you can create a custom format using the standard .NET date and time format specifiers. To add a custom format, click Enter custom value in the dropdown.

Some tips when creating a format for the date ‘2020-10-13‘ (October 13, 2020):
    yy = 20
    yyyy = 2020

    MM = 10
    MMM = Oct
    MMMM = October

    dd = 13
    ddd = Tue
    dddd = Tuesday

    Examples:

    yyyy-MMM-ddd = 2020-Oct-Tue
    yy/MMMM/dddd = 20/October/Tuesday
    dddd, MMMM, yyyy = Tuesday, October, 2020
    MMMM dd, yyyy = October 13, 2020
    yyyy-MM-dd = 2020-10-13 (used for comparing dates)

    To add time to your format use the following in your format:
    (It is best practice to add the letter ‘T’ before using time formats)

h = hours (12 hour time, no leading zero)
hh = hours (12 hour time, with leading zero)
    HH = hours (24 hour time)
    mm = minutes
    ss = seconds
    tt = Appends either AM or PM to time

    Some examples are:
    MMMM dd, yyyyThh:mm = October 13, 2020T12:51
    MMMM/dd/yyyyTHH:mm:ss = October/13/2020T13:02:41
hh:mm:ss tt = 01:06:41 PM
h:mm:ss tt = 1:06:41 PM
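If you want to verify these outputs yourself, the same date renders identically in Python once the .NET codes are translated to strftime codes (the mapping comments below are the only assumption):

```python
from datetime import datetime

# Rough .NET-to-Python mapping for the examples above (Python is used here
# only so the outputs can be checked; the Flow action itself uses .NET codes):
#   yyyy-MM-dd     ->  %Y-%m-%d
#   MMMM dd, yyyy  ->  %B %d, %Y
#   hh:mm:ss tt    ->  %I:%M:%S %p
d = datetime(2020, 10, 13, 13, 6, 41)
print(d.strftime("%Y-%m-%d"))     # 2020-10-13
print(d.strftime("%B %d, %Y"))    # October 13, 2020
print(d.strftime("%I:%M:%S %p"))  # 01:06:41 PM
```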

    Conclusion

Knowing these formats and what each letter code does, the possibilities are endless. You can easily create any type of custom date-time format.

    As always if you have any questions, don’t hesitate to reach out.

    Thank you for reading!

  • Power Automate – Limit Runs to Trigger One at a Time

    Power Automate – Limit Runs to Trigger One at a Time

    Summary

Controlling when your Flow triggers can be crucial. By default, Flow runs execute in parallel, meaning multiple runs could be running at the same time. This is great for performance but could cause trouble in some Flows. For example:

Let's say we have a Flow that is triggered when an item in a SharePoint list is created, and the item gets sent to other systems for different purposes. For data quality reasons, we only want the Flow to run one at a time, and have other runs wait in a queue.

    The Flow

    The only setting we need to change is in our Trigger.

    This can be done on Triggers other than SharePoint

    For this demo, I added a Compose action to get the Name of the item being created in SharePoint.

    I added a Delay action only to show what happens if multiple runs queue up.

    The setting we want to change is in the trigger, click the 3 dots on the trigger and select Settings from the drop down.

Now inside Settings, find and enable Concurrency Control, and set Degree of Parallelism to 1. This setting controls how many runs can execute at one time. Click Done.

    My trigger is When an item is created, so I will create 3 items, one every 15 seconds to show what happens with queued runs.
    1st item = Run1
    2nd item = Run2
    3rd item = Run3

    Here are my findings:

As we can see, the runs are not in the order in which I created the items. So we can conclude that if we create items faster than the trigger's polling interval, the Flow will not necessarily run in sequential order.
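A simplified way to picture the queueing: runs pile up in a queue and are drained one at a time. This Python sketch drains in FIFO order purely for illustration; as the findings above show, the real service does not guarantee that order:

```python
from collections import deque

# Runs triggered while another run is active wait in a queue; with Degree of
# Parallelism set to 1, only one run executes at any moment.
pending = deque(["Run1", "Run2", "Run3"])
completed = []

while pending:
    current = pending.popleft()  # one run at a time (order not guaranteed
    completed.append(current)    # by the real service)

print(len(completed))  # 3 — every queued run eventually executes
```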

    Conclusion

As we can see above, this trigger setting is very useful when you need to limit the Flow to one run at a time.
The limitation is that if the Flow is triggered multiple times and runs queue up, there is a chance that the runs will not execute in order.

  • Using Environment Variables as Parameters for Power Automate Deployments (ALM)

    Using Environment Variables as Parameters for Power Automate Deployments (ALM)

    Summary

    Deploying Power Automate Flows can be a headache if you have to manually change values inside the Flow for each environment. You also run the risk of missing a value.

    This post will go in depth on using Environment variables inside solutions to allow certain values to be used in different environments.
I will be using two (2) environments for this demo:
Dev and Test

This demo will utilize the data type ‘JSON’, which will save loads of time.

    Terms / Glossary

    Here are some familiar terms that will help in this post:
    Default Value = The value that you’re expecting after you deploy the Flow.
    This will be our Test environment values

    Current Value = The value that overrides the Default Value.
    This will be our Dev environment values

    Parameters = These are just values. For example 2 + 2. Each 2 is a parameter of the addition expression (+)

    ALM = Application Lifecycle Management
    Documentation on ALM

    Contents

    Prerequisites
    The Scenario
    Getting Parameter Values
    Creating Environment Variables
    Creating The Flow
    Using The Parameters
    Export And Import Deployment
    Conclusion

    Prerequisites

    • Access to Common Data Service
    • Access to create, export, and import Solutions

** Note: I have created this guide to be as simple as possible. If you name everything as I do, you will be able to copy and paste all my expressions directly into your Flow **

    The Scenario

    My Flow posts a message into Microsoft Teams group. When deploying the Flow into different environments, I want to change the Teams content like:

    • Team Name
    • Channel ID
    • Subject

Usually after deploying, we can go into the Flow and manually change the values. This can cause errors, especially if we forget to change some values after deploying (I may have done this a few times).

    Getting Parameter Values

    It is important to note that not all Action values can be parameterized. Follow the steps below to see if the values can be parameterized:

    Teams Example:
In my Teams action, I want to parameterize the Team Name, Channel ID, and Subject. For this I add the action and select the values as needed.

    Now with the information filled in, click the 3 dots ‘. . .’ on the action, and click ‘Peek Code’.

    Dev Parameters

In the ‘Peek code’ view, we can see the different parameters and values that this action uses in the background. Copy these values into Notepad or a code editor for use later. I have created a simple JSON to be placed in my Environment Variable later on.

    I will be using the Env value for my Subject in the teams message

    For example:
    Team = 1861402b-5d49-4850-a54b-5eda568e7e8b
    Channel = 19:be3759762df64d94a509938aa9962b29@thread.tacv2
    Subject = Message From Dev Environment

To test that we can use custom values as parameters, grab the values from above and insert them into the ‘Teams Post a message’ action as custom values, then run the Flow.

    Mine looks like this now:

    Now run the Flow to make sure everything works as expected using the Custom values

Now that we know custom values work for the inputs/parameters of the action, we want to get the values for the other environment. Remove the custom values from the inputs and choose the correct values that we want to use when this Flow is deployed to another environment. For example:

    Again we do a ‘Peek code’ to get the parameter IDs that this action uses

    Test Parameters

    Copy these values into notepad or a code editor for use later. Now we have two sets of values, one for Dev, and one for Test. I have created a simple JSON to be placed in my Environment Variable later on.

    I will be using the Env value for my Subject in the teams message

Make sure the two (2) JSONs have the same structure. We will use this data to populate the Environment Variables in the following steps.
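For illustration, here is what the pair of JSONs might look like. The Dev values come from this post, while the Test values and the key names (Team, Channel, Env) are placeholders:

```python
# Hypothetical Dev and Test parameter JSONs. The Dev GUIDs come from this
# post; the Test values and the key names are placeholders.
dev = {
    "Team": "1861402b-5d49-4850-a54b-5eda568e7e8b",
    "Channel": "19:be3759762df64d94a509938aa9962b29@thread.tacv2",
    "Env": "Message From Dev Environment",
}
test = {
    "Team": "00000000-0000-0000-0000-000000000000",  # placeholder GUID
    "Channel": "19:placeholder@thread.tacv2",        # placeholder
    "Env": "Message From Test Environment",
}

# Both JSONs must share the same keys so one Parse JSON schema fits both
print(dev.keys() == test.keys())  # True
```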

    Creating Environment Variables

Environment variables can only be created inside a solution; there are multiple ways you or your organization may want this set up.

In this demo I have created a Publisher in CDS called ‘param’; this will better identify the parameters that we create inside the solution. (This is optional; the default CDS publisher could also be used.)

Create a solution inside Power Automate.
Click Solutions on the left navigation menu,
then click ‘New Solution’ on the top menu, and fill out the information.

    Once the solution is created,
    Click ‘New’ > Environment variable

    Now fill in the information like the screenshot below.

Note: I will be using the JSON data type. I use this for simplicity, as I have more than one value I want to add, and we can use the Parse JSON action in the Flow to reference each value separately. You can use any data type you like.

Now we populate the values we want to use per environment. The way we do this is to fill in the Default Value and the Current Value.


    Default Value = The values we want for the other environment, in this case Test
    Current Value = The values we want to use for this environment, in this case Dev

    Once the values are pasted in click ‘Save’

    Creating The Flow

    I will be creating the Flow inside the solution that the Environment Variable is in from above.
    Inside the solution click ‘New’ > Flow

For the demo I will use a Manual trigger, then add two (2) ‘Initialize variable’ actions.

    Initialize variable – Schema Name
    Name: schemaName
    Type: String
    Value: Put the Name of the environment variable in the value (This can be found on the main screen of the solution under Name column)

    Initialize variable – Parameters
    Name: parameters
    Type: String
    Value: Leave blank for now, this variable will store either the Default Value or Current Value, based on which environment we are in

    Next add a ‘Scope’ action to hold all the actions that will get the parameters

    I renamed my ‘Scope’ to Load Parameters.

NOTE: You can copy and paste my filter queries as long as you kept the same names as I did. The @ in Power Automate allows you to call expressions without having to go into the expression tab. If you want to build the expressions yourself, I will include the syntax under each picture of the List records actions and onward.

    Inside the Scope, add Two(2) ‘Common Data Service Current Environment – List Records’ actions.

1) List records – Parameter Definitions
Entity name: Environment Variable Definitions
Filter Query: schemaname eq '@{variables('schemaName')}'

schemaname eq 'YOUR ENVIRONMENT VARIABLE NAME'

2) List records – Parameter Current Values
Entity name: Environment Variable Values
Filter Query: _environmentvariabledefinitionid_value eq '@{first(outputs('List_records_-_Parameter_Definitions')?['body/value'])?['environmentvariabledefinitionid']}'

first(outputs('List_records_-_Parameter_Definitions')?['body/value'])?['environmentvariabledefinitionid']

    Now we need to check which value to use, the Default Value, or the Current Value.

Add an ‘If Condition‘ and build the condition like this:

If Current Value is empty
Left Value: @first(outputs('List_records_-_Parameter_Current_Values')?['body/value'])?['Value']

is equal to
Right Value: @null

    Next in the ‘If yes‘ block add a ‘Set Variable

Set variable – Parameter Default
Name: parameters
Value: @{outputs('List_records_-_Parameter_Definitions')?['body/value'][0]['defaultvalue']}

    In the ‘If no‘ block add a ‘Set variable

Set variable – Parameter Current
Name: parameters
Value: @{outputs('List_records_-_Parameter_Current_Values')?['body/value'][0]['Value']}

Under the ‘If condition‘ add a ‘Parse JSON‘ action.
Name: Parse JSON – Parameters
Content: @{variables('parameters')}
Schema: To generate your schema, click the ‘Generate from sample’ button, then paste in the JSON that you used for the Default Value.

We are done with the parameter scope now.
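The whole Scope boils down to one decision; here is a Python sketch of that logic, with simplified stand-ins for the List records outputs:

```python
import json

# Sketch of the Scope's branching: prefer the Current Value record when one
# exists, otherwise fall back to the definition's Default Value. The record
# shapes below are simplified stand-ins for the List records outputs.
def resolve_parameters(definition: dict, current_values: list) -> dict:
    if current_values and current_values[0].get("Value") is not None:
        return json.loads(current_values[0]["Value"])  # 'If no' branch
    return json.loads(definition["defaultvalue"])      # 'If yes' branch

definition = {"defaultvalue": '{"Env": "Message From Test Environment"}'}
current = [{"Value": '{"Env": "Message From Dev Environment"}'}]

print(resolve_parameters(definition, current)["Env"])  # Dev value wins
print(resolve_parameters(definition, [])["Env"])       # after deployment: Default
```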

    Using The Parameters

I will add the Teams action ‘Post a message‘ and use the dynamic content from my ‘Parse JSON‘ action.

    Triggering the Flow, I expect the Dev values to be used (Current Value)

    Here is the Teams message:


    Next we will export and import into a different Environment which will use different parameters (Default Value)

    Export And Import – Deployment

    Overview of this sections steps:
    1. Remove Current Value from Environment variable
    2. Publish Solution
    3. Export Solution
    4. Import Solution
    5. Authorize any Connections
    6. Enable Flow
    7. Trigger / Test

    1. Remove Current Value from Environment variable

Inside the Solution click on the Environment variable; under Current Value click the 3 dots ( . . . ) and select Remove from this solution.

    This will only remove the values from this solution. The Current Values will still be in CDS and can also be added back into this solution if needed by clicking Add existing under Current Value
2. Publish Solution

Inside your solution, click ‘Publish all customizations’.

3. Export Solution

    Once published click ‘Export’

Export as ‘Managed’ or ‘Unmanaged’, based on your needs.

4. Import Solution

    Switch to your other Environment, and click Solutions tab. Click ‘Import’

Choose the Zip file from your Export, and follow the instructions by clicking ‘Next’

5. Authorize any Connections

Once the solution is imported successfully, you may need to authorize any connections inside the Flow. This can be done by opening the Flow from inside your solution and clicking ‘Sign in’ on all actions, then clicking ‘Save’

6. Enable Flow

    Now enable / turn on the Flow

7. Trigger / Test

Trigger the Flow to confirm the correct values (Default Value) are coming over.

    Test Env – Using Default Value as expected

    Now in Teams our message has been posted in a different team chat, different channel, and with the ‘Test’ text in the subject line

    Conclusion

    After reading this post:
    https://powerapps.microsoft.com/en-us/blog/environment-variables-available-in-preview/
I wanted to build a step-by-step guide that is practical and beneficial. Since this feature is in ‘Preview’, it could change without notice.

I hope this guide helps you make the ALM for your Power Automate Flows more sustainable.

    Flow Template Download and guide:
    Loading Environment Variables – Template

    Thanks

  • Grabbing Error Message From Failed Run

    Grabbing Error Message From Failed Run

    Summary

When a Flow fails, sometimes we want to capture the error message and send it out to a user, a support team, or a Teams channel. In this demo we will go through the steps to capture any error messages from a failed run.

    Steps

    I have written this blog post on the Power Automate Community:
    https://powerusers.microsoft.com/t5/Power-Automate-Community-Blog/Grabbing-Error-Message-From-Failed-Run/ba-p/666015

    If you can, please like and share my post.

    Stay safe, and have a great day!

  • Power Automate Integrated With Virus Total to Scan Files and Links

    Power Automate Integrated With Virus Total to Scan Files and Links

Virus Total in Power Automate. Now we can scan links or files and generate a report, right in Power Automate. Some examples include links or files from emails, Teams, etc.

    What is Virus Total

Virus Total is a free and powerful tool to scan files and links. It uses the hash of the file/URL and checks some of the most popular antivirus engines to generate a report. https://www.virustotal.com/

    Prerequisites

This is a Premium connector.
Note: At the time of this blog, these actions are in Preview.

Virus Total has two types of API:
    Free (Public):
    – The Public API is limited to 4 requests per minute and 1K requests per day.
    – The Public API must not be used in commercial products or services.
    – The Public API must not be used in business workflows that do not contribute new files.
    Paid (Premium):
    – The Premium API does not have request rate or daily allowance limitations, limits are governed by your licensed service step.
    – The Premium API returns more threat data and exposes more endpoints and functionality.
    – The Premium API is governed by an SLA that guarantees readiness of data.

Keep the above information in mind when using the API.


    To use the Virus Total connector, you must sign up on their site and get a token. To get the token, follow these steps:

First, head over to https://www.virustotal.com/ and sign up for free.
Next, you will have to confirm your email address.

Once you can log in to your account, click your profile icon in the top right and select API key

Now copy the API key that you are given. That is it! You can use that key to create a connection with the Virus Total connector in Power Automate.

    Connection Setup

    First, we make a connection to the Virus Total API.
    In your Flow, add a new action, search for Virus Total.

Virus Total has a couple of very powerful actions here.
We will be using the ‘Analyse an URL’ action for this demo.

    All that is needed to create the connection is your API key from the prerequisites.

The connection name can be anything you want; for this demo I chose VirusTotalDemo.

Now that we have the connection established, we can build the logic for analyzing a URL.

    Building the Flow

I will be using a Button trigger and a variable to store the URL I want to analyze, but at the end of this blog I will go through some use cases of how this can be implemented.

    There are two main actions in the Virus Total connector I will be using:
    – Analyse an URL
    – Retrieve information about a file or URL analysis

    I am using a string variable to store the URL.
Now we use the Virus Total action called Analyse an URL. This action only needs one input: the URL we want to analyse. It outputs the ‘id’ for the analysis, which we can use in our next action.

    Now we add the second Virus Total action called: Retrieve information about a file or URL analysis. This action wants the ‘id’ from the first step.
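To make the chaining between the two actions concrete, here is a rough Python sketch. The response shapes below are simplified stand-ins modeled on the VirusTotal v3 API, not the connector's exact payloads: the first action yields an analysis ‘id’, which the second action uses to return the per-engine verdict counts under ‘stats’.

```python
# Output of 'Analyse an URL': an analysis id for the submitted URL.
# (Hypothetical id value, for illustration only.)
analyse_response = {"data": {"type": "analysis", "id": "u-abc123-1600000000"}}
analysis_id = analyse_response["data"]["id"]

# That id feeds 'Retrieve information about a file or URL analysis',
# whose output includes the verdict counters under 'stats'.
retrieve_response = {
    "data": {
        "attributes": {
            "status": "completed",
            "stats": {"harmless": 70, "malicious": 0, "suspicious": 0,
                      "timeout": 0, "undetected": 15},
        }
    }
}
stats = retrieve_response["data"]["attributes"]["stats"]
print(analysis_id, stats["malicious"])
```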

    From here we get a bunch of useful dynamic content about this URL. For this demo, I will use the ‘stats’ dynamic content; this is a JSON object, so I will add a Parse JSON action.
    To get the schema, you can either copy my schema, or use {} as the schema, run the flow, then copy the outputs into ‘Generate from sample’.

    My schema is:

    {
        "type": "object",
        "properties": {
            "harmless": {
                "type": "integer"
            },
            "malicious": {
                "type": "integer"
            },
            "suspicious": {
                "type": "integer"
            },
            "timeout": {
                "type": "integer"
            },
            "undetected": {
                "type": "integer"
            }
        }
    }
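As a quick sanity check, the schema above just says the stats object is made of integer counters. Here is a hand-rolled Python check of that shape (my own helper, not the connector's validation; the schema does not mark any property as required, so only present keys are checked):

```python
def matches_stats_schema(stats):
    """Check a stats object against the schema above: an object whose
    five verdict counters, when present, must be integers."""
    if not isinstance(stats, dict):
        return False
    keys = ("harmless", "malicious", "suspicious", "timeout", "undetected")
    # bool is a subclass of int in Python, so exclude it explicitly.
    return all(
        isinstance(stats[k], int) and not isinstance(stats[k], bool)
        for k in keys if k in stats
    )

print(matches_stats_schema({"harmless": 70, "malicious": 0, "suspicious": 0,
                            "timeout": 0, "undetected": 15}))  # True
print(matches_stats_schema({"harmless": "70"}))  # False: counter is a string
```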

    From parsing the ‘stats’ object, I am able to check if the URL has any harmless, malicious, suspicious, or undetected reports. This information can be very useful.

Now I can add an If condition and control what kind of sites I want to classify as harmful or malicious. Here is my condition:

If 3 or more report engines flag the URL as malicious, OR the report has fewer than 50 harmless results, then I classify this URL as BAD.
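That condition boils down to a simple boolean check. Here is the same logic as a small Python function (the thresholds are just the ones I chose for this demo):

```python
def is_bad_url(stats):
    """Flag a URL as BAD when 3 or more engines call it malicious,
    or when fewer than 50 engines call it harmless."""
    return stats["malicious"] >= 3 or stats["harmless"] < 50

# A clean report: plenty of harmless verdicts, no malicious ones.
print(is_bad_url({"harmless": 70, "malicious": 0}))  # False
# A suspect report: only a few harmless verdicts.
print(is_bad_url({"harmless": 12, "malicious": 1}))  # True
```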

    Use Cases / Conclusion

    I have only scratched the surface with Virus Total in this blog. I am sure many people can find great uses for this. One example of how it can be used:
    Have users send files or URLs to be analyzed, to help with cyber security. This could be triggered by:
    – A Flow bot in Teams
    – A flow monitoring a certain email address, parsing the body for URLs and checking for attachments

    Thanks for reading!