Intune App configuration policy – Edge/Chrome URLBlocklist on Android: ‘Expected list value’ error

It’s funny when you plan to post about a topic, run into an error along the way, and end up publishing about the error instead. That’s what happened here: while comparing the app configuration policies for Edge and Chrome, I ran into the ‘Expected list value’ issue when trying to set up the URLBlocklist (see the sidenote at the end).

TL;DR
– URLBlocklist should be an array, not a string
– Intune’s ‘Configuration designer’ doesn’t let you change the value type; use the JSON editor instead
– Change the URLBlocklist managedProperty from valueString to valueStringArray type and make sure the value is an array of strings, example:
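A minimal sketch of the corrected entry as it appears in the JSON editor (the kind/productId wrapper and the example URLs are illustrative – the relevant part is the managedProperty item):

{
  "kind": "androidenterprise#managedConfiguration",
  "productId": "app:com.android.chrome",
  "managedProperty": [
    {
      "key": "URLBlocklist",
      "valueStringArray": [
        "f12.hu",
        "facebook.com"
      ]
    }
  ]
}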

Hint: keep a copy of the JSON config, because Intune compiles the configuration against the predefined schema and you will be greeted with an empty configuration when you later attempt to modify the JSON data.

Explained

Managed configurations (formerly ‘application restrictions’) can be deployed along with the Android app, if the application has these settings defined. These configuration items are exposed to the EMM partners by using Google Play APIs [so app configuration is not some Intune magic, it’s the beauty of Android Enterprise]. When you are setting up an app configuration policy (for managed devices) the configuration keys displayed are actually read from the application’s app_restrictions.xml file.

Google Chrome managed configuration items

For Edge and Chrome there is a managedProperty called URLBlocklist aka ‘Block access to a list of URLs’. As you can see the Value type here is string:

In the following example, I’m trying to block my webpage and Facebook via this setting:

Blocklist using configuration designer
Settings opened in JSON editor

Once the settings are deployed, open edge://policy or chrome://policy and you will see the following:

‘Expected list value’ error

At this point we should understand how the setting is actually expected to be configured, referring to the Chrome Enterprise documentation (link). The URLBlocklist property is a list of strings (an array), so it does not accept a plain string. Now that we know what data type is needed, we have to figure out what that type is called in the managed configuration (I did the googling for you: it is valueStringArray).

Going back to Intune, open the JSON editor and change the data type and the value:

Before (left) and after (right)

This time, there is no error and the configuration works as expected:

URLBlocklist with valueStringArray data type
Facebook blocked

Unfortunately, when you later try to modify the JSON, the settings are cleared, so make sure you have a copy of them:

JSON configuration when trying to edit the previous settings


Sidenote: a URL blocklist can also be specified for Edge using Managed app configuration, but that requires an Intune App Protection Policy (APP). There are some rare scenarios where you don’t want to, or can’t, apply APP (eg. dedicated devices without Shared device mode).

Nextcloud with AzureAD Application Proxy

There are certain scenarios where Microsoft’s OneDrive/SharePoint solution is not an option for storing data (eg. data localization restrictions enforced by law). However, if you still want to provide your users with the file sync experience and/or other collaboration features, you may have come across Owncloud or Nextcloud as an alternative. But have you considered publishing it via Azure AD Application Proxy?

In this post, I will install a Nextcloud server on Ubuntu, integrate it with AzureAD and publish it with the AzureAD Application Proxy service.

Prerequisites:

  • custom DNS domain
  • certificate for Nextcloud app (eg. nextcloud.f12.hu)
  • VM for Nextcloud server, ubuntu server installer
  • Windows Server VM for Application Proxy connector
  • Azure AD Premium P1 licence (or other bundles including AAD P1)

TL;DR

  • install a new Ubuntu server with nextcloud snap
  • do the basic setup for nextcloud, including https
  • from the App bundles install “SSO & SAML authentication”
  • install Azure AD App Proxy connector on a Windows Server with direct line-of-sight on the Nextcloud server (HTTPS access should be enough)
  • DNS setting (split brain DNS): the Windows Server with the App Proxy connector should resolve the FQDN of the app (eg.: nextcloud.f12.hu) to the private address of the Nextcloud server (see the example after this list).
  • on the Application proxy page – “Configure an app” – create the Nextcloud app and configure SAML SSO (there is a very good and detailed post on this by Nate Russel, I will cover the required steps too)
  • configure SAML settings on the Nextcloud server (as per in the previous link or see below)
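For the split-brain DNS entry above, the quickest option in a lab is a hosts file entry on the connector server (the IP address below is an example; a proper internal DNS record is the cleaner choice):

Add-Content -Path "$env:windir\System32\drivers\etc\hosts" -Value "10.0.0.10`tnextcloud.f12.hu"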

This post will NOT cover:

  • proper storage configuration for Nextcloud and other design principles
  • user matching with previously configured accounts in the Nextcloud environment
  • preparing certificate
  • Active Directory integration (AzureAD only)

Infrastructure basic setup

I’m not an Ubuntu expert, so I left everything at the default/recommended settings during installation – the only thing to mention here is that I chose to install the nextcloud snap. While waiting for the installation, I created a DNS entry (nextcloud.f12.hu) pointing to the internal IP address of the Nextcloud server.

When the installation was ready, I navigated to the website (http://<ipaddress of the server>) and created the admin account:

Initialization

The next step is to set up the listener so that Nextcloud answers to nextcloud.f12.hu, not only to its IP address.

nano /var/snap/nextcloud/current/nextcloud/config/config.php
config.php trusted_domains
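The relevant part of config.php looks something like this (the indexes and existing entries will differ in your install – just append the FQDN to the trusted_domains array):

'trusted_domains' =>
  array (
    0 => 'localhost',
    1 => 'nextcloud.f12.hu',
  ),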

Upload the certificate, key and chain, then enable HTTPS for Nextcloud:

nextcloud.enable-https custom <cert> <key> <chain>
enable-https

Now we are ready:

HTTPS enabled

Now navigate to Apps – App bundles and install “SSO & SAML authentication” package

SAML authentication package download

After that navigate to Settings – SSO & SAML authentication and select “Use built-in SAML authentication”

And stop here, but don’t close the browser window, we will come back soon.

Azure Active Directory configuration

Navigate to portal.azure.com – Azure Active Directory – Application proxy. After downloading the connector and installing it on a Windows Server, the service is enabled. (Very briefly: this server is the proxy that receives the requests and forwards them to the internal URL specified below. The trick is that the internal and the external URL are the same, which allows a consistent user experience.)

So click on Configure an app:

configure an app

On the “Add your own on-premises application” page, enter the name and the URL values (reminder: the internal and external URLs are the same) – and don’t forget to register the CNAME entry in the public DNS zone.

Adding the Nextcloud application

One sidenote: using Azure Active Directory pre-authentication is fine as long as you only plan to access Nextcloud from a browser. If you need the sync client to work, this needs to be switched to Passthrough (with the security implications in mind).

When ready, click Create, then on the same screen you can upload the PFX certificate:

certificate upload

Next, take care of assignments – you can either assign the app to a set of users (Users and groups tab), or set the “Assignment required” option to No on the Properties tab.

Assignment required-No

Next, click on Single sign-on and select SAML authentication. The Basic SAML Configuration blade should include the Entity ID and the Reply URL(s) with the following values:

Basic SAML configuration

On the SAML certificates blade you can either download the Federation Metadata XML and copy the value of the <X509Certificate> tag as per the tutorial linked above, or you can download the Base64 encoded certificate, open it with Notepad and remove the BEGIN CERTIFICATE / END CERTIFICATE lines and the line breaks. Either way, this certificate value will be needed in the Nextcloud SAML configuration.

SAML Certificate download
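If you would rather not edit the certificate file by hand, a quick PowerShell alternative (assuming the downloaded file is named NextcloudCert.cer) strips the BEGIN/END lines and the line breaks and puts the result on the clipboard:

(Get-Content .\NextcloudCert.cer | Where-Object { $_ -notmatch 'CERTIFICATE' }) -join '' | Set-Clipboard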

Now that we have the certificate, navigate back to the Nextcloud SAML settings and fill in the values as per Nate’s tutorial:

SAML setup

And the final step is to enable SAML:

Enable SAML

Now when we navigate to the Nextcloud server’s URL, we are redirected to the Microsoft login portal – and that’s it. (Hint: as it is an Enterprise application, you can apply Conditional Access policies to make it more secure; just keep in mind the considerations mentioned above about Passthrough pre-authentication.)

Monitor AzureAD App registration expiration with PowerShell (GraphAPI)

There are several methods for monitoring Azure AD App registration expiration (like Power Automate or Azure Logic Apps), but these methods require extra licences or an Azure subscription. The PowerShell way is free and only requires a new app registration in AzureAD.

TL;DR

  • Create a new app registration with Microsoft Graph Application.Read.All application permission
  • Add a client secret to the app, copy the secret as it will be used in the script
  • Use the script below, fill the variables $tenantID, $appID, $appSecret, $daysBeforeExpiration
  • The output is a PSCustomObject; it is up to you how to process it (the example below converts it to a bordered HTML table which is then sent in an email – make sure to adjust the parameters when using it)

The script:

$tenantID = ''
$appID = ''
$appSecret = ''
$daysBeforeExpiration = 30

#Ensure TLS1.2 is used
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

#Get access token
$scope = 'https://graph.microsoft.com/.default'
$oAuthUri = "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token"
$body = [Ordered] @{
    scope = "$scope"
    client_id = "$appId"
    client_secret = "$appSecret"
    grant_type = 'client_credentials'
}
$response = Invoke-RestMethod -Method Post -Uri $oAuthUri -Body $body -ErrorAction Stop
$aadToken = $response.access_token

#Query app registrations, store it in $applications variable
$url = 'https://graph.microsoft.com/v1.0/applications'

$headers = @{
    'Content-Type' = 'application/json'
    'Accept' = 'application/json'
    'Authorization' = "Bearer $aadToken"
}

$applications = $null
while ($null -ne $url){
    $json_response = (Invoke-WebRequest -Method Get -Uri $url -Headers $headers -ErrorAction Stop | ConvertFrom-Json) #use -UseBasicParsing if scheduling with 'NT Authority\Network Service'
    $applications += $json_response.value
    $url = $json_response.'@odata.nextLink'
}

#for each app select the latest credentials of each credential type
$apps_Cred_latest = foreach ($app in $applications){
    [pscustomobject]@{
        Name = $app.displayname
        LatestKey = $app.KeyCredentials.enddatetime | sort -Descending | select -First 1
        LatestPass = $app.PasswordCredentials.enddatetime | sort -Descending | select -First 1
    }
}

#select apps that are expiring within the range defined in $daysBeforeExpiration
$expiringApps = $apps_Cred_latest | ? {($_.LatestKey,$_.LatestPass -ne $null)} | ? {[datetime]($_.LatestKey, $_.latestPass | sort -Descending | select -First 1) -le (Get-date).AddDays($daysBeforeExpiration) } 


if ($expiringApps){
    $Header = @"
<style>
TABLE {border-width: 1px; border-style: solid; border-color: black; border-collapse: collapse;}
TH {border-width: 1px; padding: 3px; border-style: solid; border-color: black;}
TD {border-width: 1px; padding: 3px; border-style: solid; border-color: black;}
</style>
"@

    [string]$html_expiringApps = $expiringApps | ConvertTo-Html -Head $Header

    Send-MailMessage -From "daniel@f12.hu" -To "reports@f12.hu" -SmtpServer smtp.f12.hu -Port 25 -Subject "Azure AD credentials expiring in $daysBeforeExpiration days" -Body $html_expiringApps -BodyAsHtml -Encoding UTF8
}
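To schedule the script, a sketch using the ScheduledTasks module (the task name, script path and time are example values – adjust the principal to your environment and note the -UseBasicParsing comment in the script when running as NETWORK SERVICE):

$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\Get-AppRegExpiration.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At '07:00'
$principal = New-ScheduledTaskPrincipal -UserId 'NT AUTHORITY\NETWORK SERVICE' -LogonType ServiceAccount
Register-ScheduledTask -TaskName 'AAD AppReg expiration report' -Action $action -Trigger $trigger -Principal $principal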

Backup AzureAD Conditional Access Policies v2 – Graph API

AzureAD PowerShell is planned for deprecation (link), so I redesigned my Conditional Access Policy backup solution originally posted here. This v2 edition uses an AzureAD app registration for unattended access (eg. a scheduled script) and the Microsoft Graph API (but not the Microsoft Graph PowerShell module).

The idea and the logic are the same as in the previous version: Conditional Access policies can be exported as JSON, and if a policy in AzureAD differs from our latest exported version, we need an “incremental” backup.

TL;DR
– Create an app registration with Policy.Read.All Application permission (and grant admin consent) and a client secret (don’t forget to update the secret in the script before it expires)
– Copy the script below, fill the variables $tenantID, $appID,$appSecret (and $backupDir if you want it elsewhere)
– Schedule the script (based on my testing, SYSTEM/LOCAL SERVICE can’t be used for REST calls, so I’m using a user account for this purpose – to be fixed)
– If you need a “full backup” you can run the function Backup-AADCAs without the -ChangedOnly parameter

AzureAD App registration

In AzureAD go to App registrations and click on New registration:

Give the app a name and click on Register:

On the app registration page navigate to API permissions – Add a permission – Microsoft Graph – Application permissions

Add Policy.Read.All permission

You can remove the default User.Read permission, then grant admin consent for Policy.Read.All

Navigate to Certificates & secrets – New client secret – create a new secret (don’t forget to refresh the secret in the script before it expires)

Copy the client secret (use the copy button) and insert it in the $appSecret variable in the script

Head back to Overview, copy the tenant ID to $tenantID and application (client) ID to $appID

Schedule the script according to your needs.

Script

$backupDir = "C:\AAD_CA_Backup"

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

$tenantID = '' #tenantID
$appID = ''#appID
$appSecret = '' #appSecret
$scope = 'https://graph.microsoft.com/.default'
$oAuthUri = "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token"
$body = [Ordered] @{
    scope = "$scope"
    client_id = "$appId"
    client_secret = "$appSecret"
    grant_type = 'client_credentials'
}
$response = Invoke-RestMethod -Method Post -Uri $oAuthUri -Body $body -ErrorAction Stop
$aadToken = $response.access_token

#Conditional Access policies web request body
$url = 'https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies'
$headers = @{ 
    'Content-Type' = 'application/json'
    'Accept' = 'application/json'
    'Authorization' = "Bearer $aadToken" 
}

function Import-AADCABackups {
    gci -File -Recurse $backupDir -Include *.json | % {
        [pscustomobject]@{
         ID = ($_.Name.Split("_"))[0]
         Version =[datetime]::ParseExact( ($_.BaseName.Split("_"))[1], 'yyyyMMddHHmm', $null)
         JSON = Get-Content $_.FullName #| ConvertFrom-Json
         Name = (Get-item $_.Directory).Name
         }
        }
}

function Backup-AADCAs {
    Param(
    [Parameter(Mandatory=$false)]
    [switch]$ChangedOnly
    )
    $import_CABackups = Import-AADCABackups

    $AAD_CAs = Invoke-WebRequest -Method Get -Uri $url -Headers $headers -ErrorAction Stop | ConvertFrom-Json | % {$_.value}
    $strDate = Get-date -Format yyyyMMddHHmm

    foreach ($CA in $AAD_CAs){
        #create backup directory if it does not exist
        if (!(Test-Path "$backupDir\$($CA.displayname)")){New-item -ItemType Directory -Path "$backupDir\$($CA.displayname)" >> $null }
        
        #load JSON
        $CA_JSON = $CA | ConvertTo-Json -Depth 6 -Compress
        
        #Export changes only  
        if ($ChangedOnly){
            $import_CABackup_latest_JSON = ($import_CABackups | where({$_.ID -eq $CA.id}) | sort version | select -Last 1).JSON
            #New CA
            if ($import_CABackup_latest_JSON -eq $null){
                Write-Host "New policy found: $($CA.DisplayName)" -ForegroundColor Green
                Out-File -InputObject $CA_JSON -Encoding utf8 -FilePath "$backupDir\$($CA.displayname)\$($ca.id)_$strdate.json"
                }
            #Difference found
            if (([bool]$import_CABackup_latest_JSON) -and ($import_CABackup_latest_JSON -ne $CA_JSON)){
                Write-Host "Found difference for $($CA.DisplayName)" -ForegroundColor Yellow
                Out-File -InputObject $CA_JSON -Encoding utf8 -FilePath "$backupDir\$($CA.displayname)\$($ca.id)_$strdate.json"
                }
            #No difference found
            if (([bool]$import_CABackup_latest_JSON) -and ($import_CABackup_latest_JSON -eq $CA_JSON)){
                Write-Host "No difference found for $($CA.DisplayName)" -ForegroundColor Cyan
                }

        #Export all
        }else{
            Out-File -InputObject $CA_JSON -Encoding utf8 -FilePath "$backupDir\$($CA.displayname)\$($ca.id)_$strdate.json"
         }
    }
    #Deleted CA
    $import_CABackups | ? {$_.id -notin $AAD_CAs.id} | % {
                Write-Host "Policy deleted in AzureAD: $($_.Name)" -ForegroundColor Red
                }

 }


Backup-AADCAs -ChangedOnly

Monitor AzureAD Conditional Access Policy changes with PowerShell (Scheduled Script)

When there are multiple administrators in an AzureAD tenant, it is inevitable that someone changes settings in Conditional Access policies – without notifying everyone involved. To keep track of changes you could regularly check the AzureAD audit logs, or have an automation for it. I may be a bit old-fashioned, but I prefer to have PowerShell scripts running on-premises with Scheduled Tasks, etc. So this blogpost covers monitoring Conditional Access policy changes with a scheduled PowerShell script.

TL;DR

  • Create a self-signed cert in the local machine store of the computer on which you plan to run the scheduled script, and export the public key (see the sketch after this list)
  • Create an App registration in your tenant, grant the following Microsoft Graph API permissions (type=Application):
    • AuditLog.Read.All
    • Directory.Read.All
  • Add the exported certificate to the app registration
  • Note the tenantID, clientID and certificate thumbprint
  • Install the Microsoft.Graph.Authentication and Microsoft.Graph.Reports PowerShell modules on the computer which will run the script
  • Modify the below script accordingly:
    • $scheduleMins = task repetition in minutes. It is used to filter audit records by timestamp. Eg.: when this value is set to 60 (run every hour) it will look for audit events generated in the past 60 minutes
    • $clientID = client ID/app ID of App registration created in your AzureAD tenant
    • $tenantID = your AzureAD tenant ID
    • $certThumbprint = thumbprint of the self signed certificate
    • modify Send-MailMessage parameters on the last line to your needs
  • Schedule the task (with rights to read the local machine cert store; I use the SYSTEM account for this purpose, but this may be a security concern)
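For the first step, a sketch of creating and exporting the certificate (the subject name, validity and export path are example values):

$cert = New-SelfSignedCertificate -Subject 'CN=CAChangeMonitor' -CertStoreLocation 'Cert:\LocalMachine\My' -NotAfter (Get-Date).AddYears(2)
Export-Certificate -Cert $cert -FilePath 'C:\Temp\CAChangeMonitor.cer'
$cert.Thumbprint #note this value for $certThumbprint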

The script:

$scheduleMins = 60

$clientID = "" #clientID of registered app
$tenantID = "" #your tenant ID
$certThumbprint = "" #thumbprint of certificate used to connect to Graph API

Connect-MgGraph -ClientId $clientID -TenantId $tenantID -Certificate (gci Cert:\LocalMachine\my | ? {$_.thumbprint -eq $certThumbprint }) -ContextScope Process

### Search audit logs
    $activitiesAfter = (Get-date).AddMinutes(-$scheduleMins).ToUniversalTime().ToString("yyyy-MM-ddTHH:mmZ")
    $CAAuditEvents = Get-MgAuditLogDirectoryAudit -filter "(activityDateTime ge $activitiesAfter) and (loggedbyservice eq 'Conditional Access')"
    #exit if no events found
    if (!($CAAuditEvents)){exit}

## Create report
$report_CAEvents = foreach ($event in $CAAuditEvents){
    [pscustomobject]@{
        Action = $event.ActivityDisplayName
        UTCTimeStamp = $event.ActivityDateTime
        Initiator = $event.InitiatedBy.User.UserPrincipalName
        TargetDisplayName = $event.TargetResources.DisplayName
        TargetID = $event.TargetResources.Id
        OldValue = $event.TargetResources.modifiedproperties.OldValue
        NewValue = $event.TargetResources.modifiedproperties.NewValue
        }
    }

$Header = @"
<style>
TABLE {border-width: 1px; border-style: solid; border-color: black; border-collapse: collapse;}
TD {border-width: 1px; padding: 3px; border-style: solid; border-color: black;}
TH {border-width: 1px; padding: 3px; border-style: solid; border-color: black;}
</style>
"@
[string]$html = $report_CAEvents | ConvertTo-Html -Title "Conditional Access policy modifications" -Head $Header

Send-MailMessage -From <sender> -to <recipient> -Subject "Conditional Access Policy change alert" -Body $html -BodyAsHtml -SmtpServer <smtpserver> -Port 25

The result will show the old and new JSON notation of each modified policy. When there is a new policy, OldValue column will be empty, while a deleted policy’s report will have an empty NewValue column (for logical reasons 🙂 )

Example output

Check if IP address is already an AzureAD Named Location using PowerShell

In a large corporate environment, it’s not unusual to have several Azure AD Named Locations (whether trusted or not). It is even more challenging to keep track of these locations when several admins manage the environment. I thought it would be useful to have a script to determine whether an IP address is already covered by a Named Location.

This script requires the AzureAD PowerShell module and an AzureAD account with appropriate privileges to read these locations. It contains a stripped-down version of the IPInRange tool (link) and basically uses it to check whether an IP is in one of the Named Location IP ranges.
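A minimal sketch of the idea (IPv4 only, with a simplified CIDR check standing in for IPInRange; the IP address below is an example value):

Connect-AzureAD

$ipToCheck = '203.0.113.10'

#simplified IPv4-only stand-in for the IPInRange check
function Test-IPv4InCidr {
    param([string]$IPAddress, [string]$Cidr)
    $network, $prefixLength = $Cidr.Split('/')
    $ipBytes  = [System.Net.IPAddress]::Parse($IPAddress).GetAddressBytes()
    $netBytes = [System.Net.IPAddress]::Parse($network).GetAddressBytes()
    if (($ipBytes.Count -ne 4) -or ($netBytes.Count -ne 4)){return $false}
    [Array]::Reverse($ipBytes); [Array]::Reverse($netBytes)
    $mask = [uint32]::MaxValue -shl (32 - [int]$prefixLength)
    ([BitConverter]::ToUInt32($ipBytes,0) -band $mask) -eq ([BitConverter]::ToUInt32($netBytes,0) -band $mask)
}

#check the IP against every IP-based Named Location
Get-AzureADMSNamedLocationPolicy | ? {$_.OdataType -eq '#microsoft.graph.ipNamedLocation'} | % {
    $location = $_
    $location.IpRanges | ? { Test-IPv4InCidr -IPAddress $ipToCheck -Cidr $_.CidrAddress } | % {
        [pscustomobject]@{
            NamedLocation = $location.DisplayName
            MatchingRange = $_.CidrAddress
            IsTrusted     = $location.IsTrusted
        }
    }
}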

Sample results:

Result for an IP that is already an AzureAD Named Location
Result for an IP that is not listed in any AzureAD Named Location

Error: This mailbox database is associated with one or more move requests…

Recently, I was migrating from Exchange 2016 to Exchange 2019 and when I tried to uninstall Exchange 2016, I encountered the following error message:

Error: This mailbox database is associated with one or more move requests. To get a list of all move requests associated with this database, run Get-MoveRequest -SourceDatabase and Get-MoveRequest -TargetDatabase . To remove a move request, run Remove-MoveRequest .

I went through the basics:

  • ran the Get-MoveRequest commands against the mailbox databases (with and without specifying the database), but they didn’t list any requests
  • ran the Get-MailboxExportRequest command and removed the completed exports

After googling for a while, I found this Microsoft document, which seemed irrelevant at first: Can’t move mailboxes to Exchange Online – Exchange | Microsoft Docs

But this phrase caught my attention:

Another example for an orphaned local move request for a primary or archived mailbox would be if there’s no move request on-premises for it, but there are attributes set, such as the following:

  • msExchMailboxMoveSourceMDBLink
  • msExchMailboxMoveTargetMDBLink
  • msExchMailboxMoveSourceArchiveMDBLink
  • msExchMailboxMoveTargetArchiveMDBLink

A quick PowerShell one-liner (below) found 3 users with the msExchMailboxMoveTargetMDBLink attribute set to the mailbox database, but these users had already been migrated to Exchange Online, so I was sure this was an orphaned local move request.

get-aduser -Properties msExchMailboxMoveTargetMDBLink -filter * | ? {$_.msExchMailboxMoveTargetMDBLink}

So the solution was to open ADUC, find the users, open the Attribute Editor, find the attribute and hit Clear:
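If you prefer to stay in PowerShell, the one-liner above can be extended to clear the attribute (a sketch – it clears the attribute on every user where it is set, so review the list first):

Get-ADUser -Properties msExchMailboxMoveTargetMDBLink -Filter * |
    Where-Object { $_.msExchMailboxMoveTargetMDBLink } |
    Set-ADUser -Clear msExchMailboxMoveTargetMDBLink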

Use at your own risk 🙂

Backup AzureAD Conditional Access Policies – a different approach

Update: as the AzureAD PowerShell is being deprecated, I made an updated version which can be found here

Backing up AAD Conditional Access policies is relatively straightforward with Get-AzureADMSConditionalAccessPolicy cmdlet (don’t forget to update your AzureAD module if the cmdlet is not recognized). In this post, I want to share my own backup “solution” which can detect changes based on the previously exported settings.

TL;DR
– Ensure you have appropriate permissions to read AAD CAs
– Make sure to use up-to-date AzureAD PowerShell module
– Modify $backupDir variable accordingly

The script is here

Explained

First, we define the directory where policies will be exported, which will be created if it does not exist:
$backupDir = "C:\AAD_CA"
if (!(Test-Path $backupDir)){mkdir $backupDir}

Next, we connect to AzureAD; enter the appropriate credentials in the popup window:
Connect-AzureAD

The next part is a function declaration, which imports the previous exports from the backup directory as a custom object:

function Import-AADCABackups {
    gci -File -Recurse $backupDir -Include *.json | % {
        [pscustomobject]@{
            ID = ($_.Name.Split("_"))[0]
            Version = [datetime]::ParseExact( ($_.BaseName.Split("_"))[1], 'yyyyMMddHHmm', $null)
            JSON = Get-Content $_.FullName
            Name = (Get-item $_.Directory).Name
        }
    }
}

The main function is called Backup-AADCAs which has an optional parameter -ChangedOnly

function Backup-AADCAs {
Param(
[Parameter(Mandatory=$false)]
[switch]$ChangedOnly
)

It first imports the previous backups using the Import-AADCABackups function, then the actual ones using Get-AzureADMSConditionalAccessPolicy

$import_CABackups = Import-AADCABackups
$AAD_CAs = Get-AzureADMSConditionalAccessPolicy

After storing the current date in the $strDate variable, we loop through each policy that currently exists in AzureAD:
– create a subdirectory with the same name as the policy
– convert the policy to JSON and store it in the $CA_JSON variable

foreach ($CA in $AAD_CAs){
        #create backup directory if it does not exist
        if (!(Test-Path "$backupDir\$($CA.displayname)")){New-item -ItemType Directory -Path "$backupDir\$($CA.displayname)" >> $null }

        #load JSON
        $CA_JSON = $CA | ConvertTo-Json -Depth 6 -Compress

If the function is called with the -ChangedOnly parameter, the following happens:
– it tries to find an existing backup of the policy based on its ID and selects the latest one:

$import_CABackup_latest_JSON = ($import_CABackups.where({$_.ID -eq $CA.id}) | sort version | select -Last 1).JSON

– if no match was found, the policy in the loop is considered new and a backup is created:

#New CA
           if ($import_CABackup_latest_JSON -eq $null){
                Write-Host "New policy found: $($CA.DisplayName)" -ForegroundColor Green
                Out-File -InputObject $CA_JSON -Encoding utf8 -FilePath "$backupDir\$($CA.displayname)\$($ca.id)_$strdate.json"
                }

– if there was a match, but the latest backup’s content differs from the current one, the policy is considered changed:

#Difference found
            if (([bool]$import_CABackup_latest_JSON) -and ($import_CABackup_latest_JSON -ne $CA_JSON)){
                Write-Host "Found difference for $($CA.DisplayName)" -ForegroundColor Yellow
                Out-File -InputObject $CA_JSON -Encoding utf8 -FilePath "$backupDir\$($CA.displayname)\$($ca.id)_$strdate.json"
                }

– if there was a match and its content is the same as the current one, there was no change:

#No difference found
            if (([bool]$import_CABackup_latest_JSON) -and ($import_CABackup_latest_JSON -eq $CA_JSON)){
                Write-Host "No difference found for $($CA.DisplayName)" -ForegroundColor Cyan
                }

If -ChangedOnly parameter is not used, then everything is exported:

        }else{
            #Export all
            Out-File -InputObject $CA_JSON -Encoding utf8 -FilePath "$backupDir\$($CA.displayname)\$($ca.id)_$strdate.json"
         }
    }

The function lists those policies that were deleted:

#Deleted CA
    $import_CABackups | ? {$_.id -notin $AAD_CAs.id} | % {
                Write-Host "Policy deleted in AzureAD: $($_.Name)" -ForegroundColor Red
                }
 }

At the end of the script we choose whether to call the Backup-AADCAs function with or without the -ChangedOnly parameter.

Backup-AADCAs -ChangedOnly

Caveat: if a policy is renamed, a new directory is created. However, since the detection is based on the policy ID, no other confusion should occur.

Grant admin consent to an AzureAD application via PowerShell

Recently, I was scripting an Azure App registration workflow and had some headaches figuring out how to grant admin consent to the application with PowerShell. Actually, if AzureCLI is installed you can use the following command:

az ad app permission admin-consent --id <application id>

However, I wanted to find a native PowerShell way to solve this problem, and this is how I came across this solution. So this script basically gives some context to the answer provided by Kitwradr.

TL;DR

  1. Ensure you have the AzureRM PowerShell module installed
  2. Modify the $appName variable’s value accordingly
  3. When prompted, enter tenant admin credentials

Script:

$appName = "<AppDisplayName>"

Login-AzureRmAccount
$context = Get-AzureRmContext
$tenantId = $context.Tenant.Id
$token = [Microsoft.Azure.Commands.Common.Authentication.AzureSession]::Instance.AuthenticationFactory.Authenticate($context.Account, $context.Environment, $TenantId, $null, "Never", $null, "74658136-14ec-4630-ad9b-26e160ff0fc6")
$headers = @{
  'Authorization' = 'Bearer ' + $token.AccessToken
  'X-Requested-With'= 'XMLHttpRequest'
  'x-ms-client-request-id'= [guid]::NewGuid()
  'x-ms-correlation-id' = [guid]::NewGuid()}

$azureApp = Get-AzureRmADApplication -DisplayName $appName
$azureAppId = $azureApp.ApplicationId
$url = "https://main.iam.ad.ext.azure.com/api/RegisteredApplications/$azureAppId/Consent?onBehalfOfAll=true"
Invoke-RestMethod -Uri $url -Headers $headers -Method POST -ErrorAction Stop
API permissions before running the script
API permissions after running the script

Enjoy 🙂

Querying AzureAD App registration credential expiration

Recently, I came across an interesting post on monitoring Azure AD App registration expiration – link here. I made a simplified version which only generates a report on the expiration date of each credential.

TL;DR
Running the script below will list every credential of your AzureAD app registrations, sorted by expiration date. To run the script, ensure you have the AzureRM PowerShell module installed and that you have appropriate permissions to read the information.

Connect-AzureRmAccount
$RM_Apps = Get-AzureRmADApplication

$RM_Apps_Cred = foreach ($app in $RM_Apps){
    $tmp_cred = Get-AzureRmADAppCredential -ObjectId $app.objectid
    $tmp_cred | % {
        [pscustomobject]@{
            App = $app.DisplayName
            ObjId = $app.objectId
            CredType = $_.Type
            StartDate = $_.StartDate
            EndDate = $_.EndDate
        }
    }
    Clear-Variable tmp_cred
}

$RM_Apps_Cred | sort endDate | ft

To list only the latest credential for each application by type, the following will do:

#List only latest credentials
$RM_Apps_Unique = $RM_Apps_Cred | select app,credtype -Unique
$RM_Apps_Cred_latest = foreach ($obj in $RM_Apps_Unique){
    $RM_Apps_Cred.Where({($_.credtype -eq $obj.credtype) -and ($_.app -eq $obj.app)}) | sort enddate | select -Last 1
    }

$RM_Apps_Cred_latest | sort app | ft