Fighting AzureAD App registration client secrets – step2: limiting app password lifetime

Disclaimer: the following configurations require a Microsoft Entra Workload Identities Premium licence (link)

In my previous post, I highlighted the risks of using password credentials for apps and how to spot client secret usage for service principals. This post focuses on limiting password lifetime for apps (scoped to the tenant or to a specific application), which can be configured if your tenant has a Workload Identities Premium licence – otherwise you will receive the following error:

To add and configure organizational settings, you'll need to link a subscription with Azure AD Workload identity license to your tenant.

Error message when no Workload Identities Premium is linked to the tenant

As per the documentation, apps and service principals can have restrictions at object level and tenant level. Scoped restrictions take precedence over tenant level settings and only one policy object can be assigned to an application or service principal (link).

Create a tenant level restriction

For demo purposes, I will create a simple setting which restricts password lifetime to 1 year for applications. I’m using Graph Explorer for simplicity. This action requires the Policy.ReadWrite.ApplicationConfiguration permission, so make sure you are using an account that holds this privilege and has consented to it.

The endpoint is https://graph.microsoft.com/v1.0/policies/defaultAppManagementPolicy and the PATCH method is needed. The request body is as follows:

{
    "isEnabled": true,
    "applicationRestrictions": {
        "passwordCredentials": [
            {
                "restrictionType": "passwordLifetime",
                "maxLifetime": "P12M"
            }
        ]
    }
}
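If you prefer PowerShell over Graph Explorer, the same PATCH can be sent with the Microsoft Graph PowerShell SDK. This is only a sketch, assuming the SDK is installed and your account holds the same Policy.ReadWrite.ApplicationConfiguration permission:

```powershell
# sign in with the required scope
Connect-MgGraph -Scopes 'Policy.ReadWrite.ApplicationConfiguration'

# same request body as above, expressed as a hashtable
$body = @{
    isEnabled = $true
    applicationRestrictions = @{
        passwordCredentials = @(
            @{
                restrictionType = 'passwordLifetime'
                maxLifetime     = 'P12M'
            }
        )
    }
}

# PATCH the tenant default app management policy
Invoke-MgGraphRequest -Method PATCH `
    -Uri 'https://graph.microsoft.com/v1.0/policies/defaultAppManagementPolicy' `
    -Body $body
```

A plain GET against the same URI returns the current tenant default policy, which is handy for verifying the change.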

Creating a sample defaultAppManagementPolicy

The result is almost instant:

Password lifetime longer than 1 year is greyed out when adding a new client secret to an app

Create an appManagementConfiguration and assign to an app

We may want to further restrict some apps to a shorter password lifetime, so we create a separate policy and assign it to the application. As per the documentation, assigning requires Application.Read.All and Policy.ReadWrite.ApplicationConfiguration – for me this wasn’t enough; I received the following error:

Insufficient privileges to complete the operation.

I added Application.ReadWrite.All to my permission set and the error disappeared.

So, first, we create the configuration object (documentation), which restricts password lifetime to 6 months. The payload is the following:

{
    "displayName": "F12 - App password max lifetime 6 months",
    "description": "App password max lifetime 6 months",
    "isEnabled": true,
    "restrictions": {
        "passwordCredentials": [
            {
                "restrictionType": "passwordLifetime",
                "maxLifetime": "P6M"
            }
        ]
    }
}

It needs to be POST-ed to https://graph.microsoft.com/v1.0/policies/appManagementPolicies:

Creating an appManagementPolicy in Graph Explorer

Take note of the result; the policy ID will be used in the next step.

Next, assign this policy to the application object (documentation). We will POST the payload below to https://graph.microsoft.com/v1.0/applications/{id}/appManagementPolicies/$ref. Hint: {id} in the URL is the application’s object ID, not the client ID.

{
    "@odata.id": "https://graph.microsoft.com/v1.0/policies/appManagementPolicies/{id}"
}

Assigning appManagementPolicies to an app
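To later check which policy is assigned to a given app, the assigned policies can be listed with a GET request. A sketch, using the same {id} convention (the application's object ID) as above:

```
GET https://graph.microsoft.com/v1.0/applications/{id}/appManagementPolicies
```

An empty value collection in the response means the app falls back to the tenant default policy.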

Let’s verify the result:

App password lifetime limited by appManagementPolicy

Disable password creation for apps

The most restrictive policy is to prohibit password creation. This can be achieved using the same method described above, with this example payload:

{
    "displayName": "F12 - APPS - No password allowed",
    "description": "No password allowed for apps",
    "isEnabled": true,
    "restrictions": {
        "passwordCredentials": [
            {
                "restrictionType": "passwordAddition",
                "maxLifetime": null
            }
        ]
    }
}

The result is a warning message and the "New client secret" option greyed out:
Password addition disabled for this app

There are many other aspects of a service principal/app credential that can be managed this way, e.g. symmetricKeyAddition, customPasswordAddition and asymmetricKeyLifetime, which may be worth considering (and I hope to have an occasion to try them and share my experiences).

To be continued 🙂

Fighting AzureAD App registration client secrets – step1: reviewing client secret usage

Workload identity (including service principals) security keeps bugging me, especially the password credentials (aka client secret). I’m sure there are scenarios where this is the only option, but I see environments where these are used just because it is easier to implement. And one day I woke up and realized how dangerous it can be – so now I’m fighting client secrets as hard as I can.

TL;DR
– Why: a leaked client secret can be easily used without being noticed (or hardly noticed… you may keep an eye on risky workload identities or have other solutions in place)
– How:
– review client secret usage and try to migrate to certificate based auth,
– at least don’t store these secrets hard coded in scripts or programs,
– use conditional access for workload identities (Microsoft Entra Workload Identities licence is required),
– limit password lifetime (Microsoft Entra Workload Identities licence is required)

This is a huge topic, so I will split it into some form of “series”.

So let’s start with the Why?
As I mentioned, a leaked credential can be hard to notice (if there is no monitoring in place, or IT is not aware of the available option to review risky workload identities). In AzureAD – Security – Identity Protection (Entra: Protect & secure – Identity Protection) you can find “Risky workload identities” (to be discussed in another post).

Let’s imagine a targeted brute force scenario. To access resources using a service principal, you need 3 things: the tenant ID, the application ID and the password. The tenant ID for a domain can be easily acquired; the easiest way is to navigate to AzureAD – External Identities – Cross-tenant access settings – Add organization, then enter the domain name:

Gathering tenant ID for a domain

Guessing an application ID is nearly impossible by hand; with enough compute power, however, it is only a matter of time. When someone tries to access your tenant using a non-existent app ID, the response is an HTTP 400 (Bad request) error with the following message:

"error":"unauthorized_client","error_description":"AADSTS700016: Application with identifier '<appID>' was not found in the directory '<tenant>'

On the other hand, when using an existing app ID with a wrong password, the response is an HTTP 401 (Unauthorized) error with the following message:

"error":"invalid_client","error_description":"AADSTS7000215: Invalid client secret provided. Ensure the secret being sent in the request is the client secret value, not the client secret ID, for a secret added to app '<appID>'
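For context, both error messages come from the OAuth 2.0 client credentials token endpoint. A hedged sketch of the kind of request an attacker would iterate (all values are placeholders):

```
POST https://login.microsoftonline.com/<tenantID>/oauth2/v2.0/token
Content-Type: application/x-www-form-urlencoded

client_id=<appID>
&client_secret=<password guess>
&grant_type=client_credentials
&scope=https://graph.microsoft.com/.default
```

The difference between the AADSTS700016 (app not found) and AADSTS7000215 (wrong secret) responses is exactly what lets an attacker confirm a valid app ID before starting to guess the secret.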

The last step is to brute force every password combination 😁 Okay, okay, it is already hard to get the app ID, and it is even more difficult to pick an app that has password credentials and then guess those credentials – but not impossible. And I’m sure there are more sophisticated ways to skip to the last step (e.g. by default a user can read app registrations in the AzureAD portal and even read the hint for the client secret value).

A non-privileged user has access to the client secret hint by default

Sure, you can review service principal sign-ins from the portal to detect anomalous activities, but this sounds like a very tedious task – unless you have some monitoring solution in place.

How to spot client credentials usage?

My first step towards achieving a password-free environment is to make an inventory of apps with client secrets. In my previous post, I wrote some words about AzureADToolkit and shared a custom script to get a report on these apps with the API permissions assigned.

This time, I’m focusing on sign-in activity to find active password credential usage. When we switch to “Service principal sign-ins” in the Sign-in logs menu on the AzureAD portal, we can filter the results by the client credential type used:

Filtering logs by client credential type

While this may be enough for a one time review, you may want to monitor password usage later. I prefer to have a PowerShell script for this purpose, but there are certainly other solutions available. I didn’t find ready-to-use cmdlets to query service principal sign-ins so I chose the hard way to write my own. Using developer tools in the browser (by hitting F12 😉) we can analyze Network traffic when opening a service principal sign-in event:

Request URL for a service principal sign-in event

What we need to see here is that the portal is using the Graph API beta endpoint and that at the end of the request the source is specified as source=sp, where sp probably stands for “service principal”. To filter by client secret usage, we will use the ‘clientCredentialType eq clientSecret’ clause. To access sign-in information, the identity used requires the ‘AuditLog.Read.All’ permission on Microsoft Graph. If you want to access this information in an unattended manner (e.g. a scheduled task), you need to create a new app registration, grant the permission (application type) with admin consent and provide a credential (hopefully a certificate 😅)

Quick guide:

1. Create an app registration
2. Remove default permission, add AuditLog.Read.All Application permission and grant admin consent

3. Create a self-signed certificate (use an elevated PowerShell session if you want it to be created in the Local Machine store; in a highly secure scenario, you can disable private key export by appending ‘-KeyExportPolicy NonExportable’):

New-SelfSignedCertificate -FriendlyName "F12 - SP client secret usage monitor" -NotAfter (Get-Date).AddYears(2) -Subject "F12 - SP client secret usage monitor" -CertStoreLocation Cert:\LocalMachine\My -Provider "Microsoft Enhanced RSA and AES Cryptographic Provider" -KeyExportPolicy NonExportable

4. Export the certificate in cer format (only the cert, not the private key)

5. Upload the certificate on the app registration page

Now that we have the right app for our needs, let’s query the information we need. To use certificate authentication, we will install the MSAL.PS module for simplicity:

Install-Module MSAL.PS

The following script will write out client secret usage for the last 24 hours:

Import-Module msal.ps
$tenantID = '<tenantID>'
$appID = '<app ID>'
$certThumbprint = '<certificate thumbprint created for the app>'
$token = Get-MsalToken -TenantId $tenantID -ClientId $appID -ClientCertificate (get-item Cert:\LocalMachine\my\$certThumbprint) -AzureCloudInstance 1

#query sign-ins for the last 24 hours
$dateStart = (([datetime]::UtcNow).AddHours(-24)).ToString("yyyy-MM-ddTHH:mm:ss.fffZ")
$url = "https://graph.microsoft.com/beta/auditLogs/signIns?api-version=beta" + '&$filter=createdDateTime%20ge%20' + $dateStart + '%20and%20createdDateTime%20lt%20' + (([datetime]::UtcNow).ToString("yyyy-MM-ddTHH:mm:ss.fffZ")) + '%20and%20clientCredentialType%20eq%20' + "'clientSecret'&source=sp"

$servicePrincipalSignIns = $null
while ($null -ne $url){
    #Write-Host "Fetching $url" -ForegroundColor Yellow
    $response = Invoke-RestMethod -Method Get -Uri $url -Headers @{ Authorization = $token.CreateAuthorizationHeader() }
    $servicePrincipalSignIns += $response.value
    $url = $response.'@odata.nextLink'
}

$servicePrincipalSignIns | select createdDateTime,ipAddress,serviceprincipalname,clientCredentialType
Sample output

To be continued…

AzureAD App registrations – the “application” permission + credentials combination security nightmare

When talking about Azure AD security, we tend to put less focus on service principals/app registrations*. But when we take into consideration that these principals can have assigned API permissions and “static” credentials (certificate or password) and that these credentials in the wrong hands can cause serious damage, we may change our attitude.
* While “App registrations” and “service principals” are different entities (link), they can be used interchangeably (link)

TL;DR
– Follow best practices for securing service principals: Conditional Access for workload identities, review AAD roles and API permissions of SPs, review SP sign-in logs, prioritize key credential usage over password credentials
– Explore the AzureADToolkit to gain insights on application credentials and API permissions
– Try out my script to start reviewing apps with Application type API permissions

Imaginary example: an IT admin created an app registration which is used in a PowerShell script for some repetitive tasks. The app was granted Directory.ReadWrite.All API permission (Application type, admin consent granted) on Microsoft Graph and a client secret was generated for the app – and this secret is saved as plain text in a script, along with the tenant id and app id. Something like this:
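The screenshot of the script is not reproduced here; a hedged, illustrative sketch of what such a risky script might look like (all values are placeholders, and this is exactly the pattern to avoid):

```powershell
# DON'T do this: tenant ID, app ID and client secret hardcoded in plain text
$tenantId = '<tenantID>'
$appId    = '<appID>'
$secret   = '<client secret>'   # anyone who reads this file can use it

# client credentials token request
$body = @{
    client_id     = $appId
    client_secret = $secret
    grant_type    = 'client_credentials'
    scope         = 'https://graph.microsoft.com/.default'
}
$token = (Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" `
    -Body $body).access_token

# ...repetitive tasks follow, backed by Directory.ReadWrite.All...
```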

If this script gets into the wrong hands… what a nightmare! 😱

What to do with these app registrations?
Follow security best practices:
– Apply Conditional Access to workload identities (link)
– Review sign-in logs (service principal sign-ins)
– Implement a credential rotation process (especially when key/password credentials are/were accessible for a leaver)
– Review service principals with AzureAD role granted (in preview)

– Prefer key credentials over password credentials (link), don’t store password credentials hardcoded if possible
– Review API permissions for App Registrations
– Identify, investigate and remediate risky workload identities (link)

AzureADToolkit
The last page linked is referencing a very cool toolkit, the AzureADToolkit which can easily identify service principals that have credentials.

Install-Module AzureADToolkit
Connect-AADToolkit
Get-AADToolkitApplicationCredentials | Out-GridView
Sample result of Get-AADToolkitApplicationCredentials

The other useful cmdlet in the toolkit (Build-AzureADAppConsentGrantReport) returns all service principals that have admin-consented permissions (each entry contains the resource display name and the permission*, e.g. Microsoft Graph, User.Read)

*sometimes it’s unable to return all the info; in my case, the following application has the Exchange.ManageAsApp permission, but this property is empty

The two commands combined are probably able to display information for app registrations with admin-consented API permissions that have credentials… but, to be honest, I had already prepared a script to gather this info when I found the toolkit 🙃

Report script

The following script returns those app registrations that have active (non-expired) credentials, along with their admin-consented application type API permissions (delegated permissions are intentionally filtered out because those are tied to the authenticated user’s delegated permissions)

Connect-MgGraph

#List apps with cert or password credentials
$apps = Get-MgApplication -all | ? {($_.KeyCredentials -ne $null) -or ($_.PasswordCredentials -ne $null)}
# keep only apps with at least one non-expired credential
$apps_activeCred = foreach ($app in $apps){
    if ((($app.KeyCredentials.EndDateTime | sort -Descending | select -First 1) -gt (Get-Date)) -or
        (($app.PasswordCredentials.EndDateTime | sort -Descending | select -First 1) -gt (Get-Date))){ $app }
}

function Get-ServicePrincipalRoleAssignmentReadable ($appId){
#query apps that have application permissions with admin consent
$roleAssignments = Get-MgServicePrincipalAppRoleAssignment -ServicePrincipalId (Get-MgServicePrincipal -Filter "appId eq '$($appid)'").id
#match permission entries with resource name and permission name
foreach ($roleAssignment in $roleAssignments){(Get-MgServicePrincipal -ServicePrincipalId $roleAssignment.ResourceId) | select @{'L'="Value";'E'={"$($_.DisplayName)/"+($_.AppRoles | ? {$_.id -eq $roleAssignment.approleid}).value}} }
}

$report = foreach ($app in $apps_activeCred){
[pscustomobject]@{ 
    Name = $app.DisplayName
    AppId = $app.AppId
    LatestKeyExpiration = $app.KeyCredentials.enddatetime | sort -Descending | select -First 1
    LatestPasswordExpiration = $app.PasswordCredentials.enddatetime | sort -Descending | select -First 1
    APIPermissions = (Get-ServicePrincipalRoleAssignmentReadable -appId $app.AppId).value

    }
} 
#filter out apps with no application type permissions
$report | ? {$_.apipermissions} | Out-GridView
Sample result of the script

Edge Drop vs. SharePoint’s access control policy for unmanaged devices

Edge Drop is a really wonderful feature, but my inner data protection officer was bugging me to investigate whether it is safe in an enterprise (or SMB) environment. There are several options to protect corporate data (labels, App Protection Policy, DLP policies, etc.), but not every business is lucky enough to afford the required licenses or to implement all these functionalities. Anyway, the goal of this post is to raise awareness and call for action: evaluate your current policies with Edge’s Drop feature in focus.

TL;DR
– Edge’s Drop feature’s impact depends on the current policies and needs
– SharePoint access control policies don’t protect against Drop
– You may want to disable Drop, even if it is currently limited to Windows/macOS

What is Edge Drop?
It is actually a chat with yourself, with the option to share files (MS doc). Files are stored in the user’s OneDrive for Business “Microsoft Edge Drop Files” folder. All you need to do is log in to the Edge browser and it is ready to use (first-time use needs 1-2 minutes to set up).

Problem statement
From the AzureAD perspective, this action is actually a sign-in to the “Microsoft Edge” application. If you have policies targeting the “Office 365 SharePoint Online” application (e.g. policies created by the SharePoint admin center’s unmanaged devices access control setting) or the “Office 365” application, these policies may not apply. This can lead to accidental data loss.

In the following scenarios, the primary principle is that cloud content should be accessible only on managed devices (Hybrid AzureAD joined or compliant) – other devices are blocked (or restricted to view-only web access).

Scenario1
SharePoint admin center – Policies – Access control – Unmanaged devices = Allow limited, web-only access

This setting creates two Conditional Access policies:
1. [SharePoint admin center]Block access from apps on unmanaged devices
Office 365 SharePoint Online app, Client app = Mobile apps and desktop client as condition, Require device to be marked as compliant or Require Hybrid Azure AD joined device as Grant control
2. [SharePoint admin center]Use app-enforced Restrictions for browser access
Office 365 SharePoint Online app, Client app = Browser as condition, Use app enforced restrictions as session control

Experience on a non corporate device, user is logged in to Edge:

As you can see, there is a warning message that files can’t be downloaded and there is no download option in OneDrive. Let’s see if Drop allows downloading:

Yes, it does… too bad.

Scenario2
SharePoint admin center – Policies – Access control – Unmanaged devices = Block access

This option generates one Conditional Access policy:
[SharePoint admin center]Use app-enforced Restrictions for browser access
Office 365 SharePoint Online app, Client app = Browser as condition, Use app enforced restrictions as session control

Experience on a non corporate device, user is logged in to Edge:

As you can see, OneDrive can’t be opened from the browser. Let’s see if Drop is dropped 🙂

Not really…

Scenario3
Conditional Access Policy – Office365 – Require device to be marked as compliant or Require Hybrid Azure AD joined device as Grant control

The purpose of this demo policy is to restrict every Office 365 resource to managed devices (including OneDrive). This is a more restrictive policy, needs a lot of planning and testing before implementing – but I hope it will prevent Drop from downloading files on unmanaged devices.

Experience: if I do not register the device, it keeps asking me to log in:

When I register the device, access is blocked:

How about Drop? This time it is stuck in Initializing, files can’t be downloaded (same experience when device is not registered):

Okay, this policy does the trick, but it has a large impact on users.

A less painful solution is to disable Drop on managed devices (link), so users won’t be able to upload files from corporate devices. This can be done via Group Policy or Intune configuration profile. However, the setting is only supported on Windows and macOS devices, so users will not be prevented from uploading via Drop on iOS/Android devices.

Conclusion: several other scenarios are possible and several tools are available to prevent accidental data leaks; these are not covered in this post, which aims only to raise awareness and make you review your security settings when new features appear in your tenant.

“Don’t do that” series – migrate personal user profile to (Azure)AD user profile with Win32_UserProfile.ChangeOwner method

Scenario: the business is now convinced that computers should be managed centrally (either with Active Directory or Azure Active Directory) instead of having WORKGROUP computers.
Problem: after joining to (Azure)AD, users will have a new profile created. Gone are their settings, wallpaper, pinned icons, etc. You need to note these settings, copy the files to the new profile and so on.
After searching the net you may come across 3rd party solutions to address this headache – or decide to find some Microsoft way to do this.
The “don’t do that” solution: use the Win32_UserProfile class’ ChangeOwner method (link).
Why: several settings are tied to cached credentials (these must be entered again), some taskbar pins will be lost, but the worst thing is that some settings may be tied to a personal account and would be carried over into the work account (user logged in to a personal Microsoft account, OneDrive syncing business data to a personal OneDrive, etc.) – with a lift-and-shift approach, these settings remain in place, and this should be avoided.

DISCLAIMER: I’m sharing this “don’t do that” tutorial just in case someone has the same idea that Win32_UserProfile.ChangeOwner is a good solution. If you have considered the above and still want to give it a try, do it at your own risk. There are tools developed or recommended by Microsoft to accelerate the process (USMT or PCmover Express [link])

So, after AzureAD joining the computer, I logged in with the user’s AzureAD account and noted the profile path for both the personal account and the corporate account. The login created the local profile, but since the Win32_UserProfile.ChangeOwner method fails if the source or the target profile is loaded, another admin is required to perform the changes. So I logged in with a Global Administrator (with the other two accounts logged off), then launched PowerShell as admin – alternatively, you can configure additional administrators [link].

gwmi -query "select * from Win32_UserProfile" | ft localpath,sid
User profiles created on the computer

Next was to store the personal profile in a variable:

$profileToReplace = gwmi -query "select * from Win32_UserProfile" | ? {$_.localpath -eq 'C:\Users\kovac'}

Then call the ChangeOwner method:

$profileToReplace.ChangeOwner('<sid of AzureAD profile>', 1)
Using the ChangeOwner method

Now when the user logs on using AzureAD credentials, the old profile is loaded with almost all settings… almost. First, a welcome message:

Cached credentials missing

The same applies to the Edge/Chrome profile. Another inconvenience I noticed is that pages pinned to the taskbar by Edge are missing their icons:

Icons before
Icons after

The login screen still shows the personal profile (if the user logs in with it, a new local profile will be created). This entry can be removed with netplwiz:

Remove personal account using netplwiz

And at this point I realized that a lot of settings can be tied to a personal cloud account which you may not want to migrate to the business profile – so I didn’t go further, but I wanted to share my experiences so that you can learn from my mistake 🙂

SharePoint Online external file sharing report using Graph API and PowerShell

The story in short: one of my customers asked me if it is possible to generate a report on all content in Office365 shared externally. After some searching, I found the following solutions:
– Run the sharing reports on each site and each OneDrive (link, link)
– Run reports based on audit logs (link)

While these reports may be adequate most of the time, I have some issues with them:
– Native reporting capabilities require opening each site manually, and these reports contain internal shares too
– Some info may be missing from these native reports (for example: the expiration date of date-limited sharing links, the password protection property, the email address of tenant guests who haven’t opened the link yet)
– Audit log based reports’ capabilities are limited by audit log retention

These issues gave me the motivation to write a script that fits my needs, and I hope others will benefit from it too.

Important note: SharePoint sharing will soon default to the Azure B2B Invitation Manager for external sharing (link). You may want to review the affected settings (link).

TL;DR

  • create an app registration in your tenant with the following application permissions for Graph API:
    • Sites.Read.All (will be needed for accessing every SPO site)
  • give the app a client secret which will be used to authenticate
    • it is more secure to opt for certificate-based auth (great article on this here), but I did not have the occasion to test it, so I’m staying with a client secret for now
  • Copy the following script below
  • $tenantID, $appID, $appSecret variables need to be declared
  • The script has one required parameter (-ReportFile) which should be the path of the HTML report (parent directory must exist) and two mutually exclusive parameters: -All if you want a report on all document libraries in your tenant (SharePoint sites and OneDrive for Business too) OR -SiteUrl <string[]> which will report only on the site specified. Example:
    • PS C:\temp> .\Get-SPExternalSharingReport.ps1 -ReportFile C:\temp\reportDaniel1.html -SiteUrl "https://ftwelvehu-my.sharepoint.com/personal/daniel_f12_hu"
    • PS C:\temp> .\Get-SPExternalSharingReport.ps1 -ReportFile C:\temp\reportAll.html -All
[CmdletBinding(DefaultParametersetName="default")]
Param(
    [Parameter(Mandatory=$true)][string]$ReportFile,
    [parameter(ParameterSetName="seta")][string]$SiteUrl,
    [parameter(ParameterSetName="setb")][switch]$All
)

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$tenantID = '<tenantID>'
$appID = '<application (client) ID>'
$appSecret = '<client secret>'
$scope = 'https://graph.microsoft.com/.default'
$oAuthUri = "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token"
$body = [Ordered] @{
    scope = "$scope"
    client_id = "$appId"
    client_secret = "$appSecret"
    grant_type = 'client_credentials'
}
$headers = $null
$tokenexpiration = $null
function Check-AccessToken {
    if (((Get-Date).AddMinutes(5)) -gt $tokenExpiration){
        # Write-Host "Token expires in 5 minutes, refreshing" -ForegroundColor Yellow
        $response = Invoke-RestMethod -Method Post -Uri $oAuthUri -Body $body -ErrorAction Stop
        $AccessToken = $response.access_token
        $script:tokenExpiration = (Get-Date).AddSeconds($response.expires_in)

        # Define headers
        $script:headers = @{
            'Content-Type'  = 'application/json'
            'Accept'        = 'application/json'
            'Authorization' = "Bearer $AccessToken"
        }
    }
}
Check-AccessToken
#Get all SP sites
$url = 'https://graph.microsoft.com/v1.0/sites/'
$spSites = $null
while ($url -ne $null){
        Check-AccessToken
        $json_response =  (Invoke-WebRequest -Method Get -Uri $url -Headers $headers -ErrorAction Stop | ConvertFrom-Json) 
        $spSites += $json_response.value
        $url = $json_response.'@odata.nextLink'
    }


function Get-SPSiteSharedDocuments ($siteID){
    $url = "https://graph.microsoft.com/v1.0/sites/$($siteid)/lists"
    $obj_DocumentsList = $null
   # Write-host -ForegroundColor Yellow "Querying site lists $url"
    while ($url -ne $null){
        Check-AccessToken
        $json_response =  (Invoke-WebRequest -Method Get -Uri $url -Headers $headers -ErrorAction Stop | ConvertFrom-Json) 
        $obj_DocumentsList += $json_response.value
        $url = $json_response.'@odata.nextLink'
    }
    $obj_DocumentsList = $obj_DocumentsList | ? {$_.list.template -match "DocumentLibrary"} #mySiteDocumentLibrary for OneDrive, documentLibrary for SP
    foreach ($doclib in $obj_DocumentsList){
        $url = "https://graph.microsoft.com/v1.0/sites/$($siteid)/lists/$($doclib.id)/items?expand=driveitem"
   #     Write-host -ForegroundColor Yellow "Querying documents $url"
        $ListItems = $null
        while ($url -ne $null){
            Check-AccessToken
            Write-host -ForegroundColor Yellow "Querying documents $url"
            $json_response =  (Invoke-WebRequest -Method Get -Uri $url -Headers $headers -ErrorAction Stop | ConvertFrom-Json)
            $ListItems += $json_response.value
            $url = $json_response.'@odata.nextLink'
    }   
    $ListItems | % {$_.driveitem} | ? {$_.shared}
    }
}

function Get-SPSharedDocumentPermission ($driveID,$docID){
    $url = "https://graph.microsoft.com/v1.0/drives/$($driveID)/items/$($docID)/permissions"
   # write-host $url -ForegroundColor Yellow
    $obj_Permissions = $null
        while ($url -ne $null){
        Check-AccessToken
        $json_response =  (Invoke-WebRequest -Method Get -Uri $url -Headers $headers -ErrorAction Stop | ConvertFrom-Json)
        $obj_Permissions += $json_response.value
        $url = $json_response.'@odata.nextLink'
    }   

    $obj_Permissions | ? {$_.link}
    }

function Get-SPSiteDocSharingReport ($siteid){
 foreach ($item in (Get-SPSiteSharedDocuments $siteid)){
    Get-SPSharedDocumentPermission $item.parentreference.driveid $item.id | % {
    [pscustomobject]@{
        WebURL = ($spSites.Where({$_.id -eq $siteid}) ).weburl
        Path = $item.parentReference.path.substring($item.parentReference.path.indexof("root:"))
        ItemName = $item.name
        Role = $_.roles -join ","
        HasPassword = $_.haspassword
        ExpirationDate = $_.expirationDateTime
        Scope = $_.link.scope
        GuestUserMail = ($_.grantedtoidentitiesv2.siteuser.loginname | ? {$_ -match 'guest#'} | % {$_.split('#') | select -Last 1} | select -Unique ) -join ", "
        AADExternaluserUPN = ($_.grantedtoidentitiesv2.siteuser.loginname | ? {$_ -match '#ext#'} | % {$_.split('|') | select -Last 1} | select -Unique ) -join ", "
        } | % {if(($_.scope -eq "users") -and ($_.guestusermail -eq "") -and ($_.AADExternaluserUPN -eq "")){}else{$_}} # filter out entries shared only with org users
    }
   } 
}

$HTMLHeader = @"
<style>
TABLE {border-width: 1px; border-style: solid; border-color: black; border-collapse: collapse;}
TH {border-width: 1px; padding: 3px; border-style: solid; border-color: black;}
TD {border-width: 1px; padding: 3px; border-style: solid; border-color: black;}
</style>
"@
 
 if ($All){
$obj_report = foreach ($site in $spSites){Write-host "querying site $($site.weburl)" -ForegroundColor Yellow ;Get-SPSiteDocSharingReport $site.id}
$obj_report | ConvertTo-Html -Head $HTMLHeader | Out-File $ReportFile
}

 if ($SiteUrl){
   $siteToQuery = $spSites.Where({$_.webUrl -eq $SiteUrl})
   if ($siteToQuery){Get-SPSiteDocSharingReport $siteToQuery.id | ConvertTo-Html -Head $HTMLHeader | Out-File $ReportFile }else{Write-Host -ForegroundColor Red "Site not found"} 
 }

Example result:

Example output for Get-SPExternalSharingReport

Explained

The script starts with the authentication process. Nothing new here, except for the Check-AccessToken function which will be used before each web request:

Check-AccessToken

Access tokens typically expire in 1 hour, so the token needs to be refreshed during script execution (if the script runs for more than 1 hour). This little function renews the access token 5 minutes before it expires.
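The function itself is embedded above as an image; the idea can be sketched roughly like this (a simplified version, not the exact script – the script-scoped variable names and the reuse of the token request body are assumptions):

```powershell
# Sketch of a token-refresh helper: renew the access token 5 minutes
# before it expires. $script:tokenExpires is set whenever a token is obtained.
function Check-AccessToken {
    if ((Get-Date) -ge $script:tokenExpires.AddMinutes(-5)) {
        # $oAuthUri and $body are the same values used for the initial token request
        $response = Invoke-RestMethod -Method Post -Uri $oAuthUri -Body $body -ErrorAction Stop
        $script:aadToken     = $response.access_token
        # expires_in is returned in seconds
        $script:tokenExpires = (Get-Date).AddSeconds($response.expires_in)
        $script:headers['Authorization'] = "Bearer $($script:aadToken)"
    }
}
```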

First step is to query all sites and store them in a variable (the @odata.nextLink logic is explained below):

Then we declare the Get-SPSiteSharedDocuments function, which does the following:

  • queries the lists for the site specified (document libraries are list items too)
  • selects those lists that are based on a document library template (based on my research the template is mySiteDocumentLibrary for OneDrive and documentLibrary for SharePoint sites)
  • because it is possible to have multiple document libraries in a site, we loop through each library and query the items in the list (MS doc here)
  • if there are more than 200 items in a list, the results are paged, which is reflected in the response – it contains the URL of the next page in @odata.nextLink, so we use a while statement to go through each page until the response no longer contains this @odata.nextLink member
  • at the end of the function only those driveitems are selected which have a member named “shared”
Get-SPSiteSharedDocuments
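The steps above can be sketched as follows (a simplified reconstruction of the embedded function; the $headers variable and error handling from the full script are assumed):

```powershell
function Get-SPSiteSharedDocuments ($siteid) {
    # document libraries are lists based on the documentLibrary
    # (or, for OneDrive, mySiteDocumentLibrary) template
    $lists = (Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/sites/$siteid/lists" -Headers $headers).value |
        Where-Object { $_.list.template -in 'documentLibrary','mySiteDocumentLibrary' }

    foreach ($list in $lists) {
        # expand driveItem so the result can be fed to the permissions query later
        $url = "https://graph.microsoft.com/v1.0/sites/$siteid/lists/$($list.id)/items?expand=driveitem"
        while ($url) {
            $json_response = Invoke-RestMethod -Uri $url -Headers $headers
            # only those driveItems that have a "shared" member
            $json_response.value.driveItem | Where-Object { $_.shared }
            $url = $json_response.'@odata.nextLink'   # paging
        }
    }
}
```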

The next function is Get-SPSharedDocumentPermission. This function takes the driveID and the documentID as parameters and returns only those items that have a member named “link”. Some things to note:
– The MS doc on the permissions API request (here) states the following:
The permissions relationship of DriveItem cannot be expanded as part of a call to get DriveItem or a collection of DriveItems. You must access the permissions property directly.
This is why it is called separately.
– SharePoint content shared externally is always link based (as far as I know), which is why only those items are selected

Get-SPSharedDocumentPermission
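Roughly, the function boils down to something like this (a sketch; the real script also calls Check-AccessToken before the request, and $headers is assumed to exist):

```powershell
function Get-SPSharedDocumentPermission ($driveid, $itemid) {
    # permissions cannot be expanded on the driveItem call,
    # so they are queried directly per item
    $url = "https://graph.microsoft.com/v1.0/drives/$driveid/items/$itemid/permissions"
    (Invoke-RestMethod -Uri $url -Headers $headers).value |
        Where-Object { $_.link }   # external sharing is link based
}
```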

The last function creates the report itself. Get-SPSiteDocSharingReport creates a pscustomobject with the information displayed in the HTML. Some parts are not the most beautiful (this may be due to my lack of hardcore scripting skills), but let me try to explain 🙂
Path: the original response didn’t seem to contain a relative path, so this one is derived from the parentReference of the driveItem, for example:

HasPassword: if the document share is protected with a password, then this is reflected here (not visible in the native report)
ExpirationDate: if the sharing link is valid for a limited time, then the expiration is shown here (not visible in the native report)
Scope: anonymous (anyone with the link can access), users (only the users specified can open), organization (shared with everyone in the org – in an external sharing report this can be relevant if you have guest users in the tenant)
GuestUserMail and AADExternalUserUPN: this took me some time to figure out. Link based shares have the grantedToIdentitiesV2 property (link). This property may contain user and siteUser objects: user is a microsoft.graph identity, while siteUser is a sharePointIdentity (link). This means that (I guess) every invitee gets a siteUser identity, but those that can be mapped to an AzureAD identity are represented as a user object too. When the B2B Invitation Manager is enabled, the two contain the same entries. My experience is that the mapped user object doesn’t get its email attribute populated until the invitee opens the link (the native report only shows the displayName for these entries). So, to include all invitees, I decided to rely on siteUser.loginName, which is not too human readable but can be parsed. If it contains ‘guest#’, the email address is extracted; ‘#ext#’ refers to an AzureAD guest user and its UPN is returned.
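The parsing itself is simple string splitting; an isolated example with made-up loginName values (the sample strings are illustrative, not taken from a real tenant):

```powershell
# Hypothetical loginName values as they might appear in siteUser.loginName
$loginnames = @(
    'i:0#.f|membership|urn%3aspo%3aguest#partner@contoso.com',
    'i:0#.f|membership|partner_contoso.com#ext#@f12hu.onmicrosoft.com'
)

# 'guest#' entries: the part after the last '#' is the invitee's email address
$guestMail = ($loginnames | Where-Object { $_ -match 'guest#' } |
    ForEach-Object { $_.Split('#') | Select-Object -Last 1 } | Select-Object -Unique) -join ', '

# '#ext#' entries: the part after the last '|' is the AzureAD guest user's UPN
$extUPN = ($loginnames | Where-Object { $_ -match '#ext#' } |
    ForEach-Object { $_.Split('|') | Select-Object -Last 1 } | Select-Object -Unique) -join ', '

$guestMail   # partner@contoso.com
$extUPN      # partner_contoso.com#ext#@f12hu.onmicrosoft.com
```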

Because internal sharing can be link based too, entries where the scope is users but neither GuestUserMail nor AADExternalUserUPN is populated (= only shared with org users) are filtered out.

Get-SPSiteDocSharingReport

The rest of the script is just the header for the HTML report and the execution of these functions. When using the -All parameter, the script displays the URL of the site currently being queried. When an invalid URL is passed to the -SiteURL parameter, the script displays a “Site not found” message – invalid here meaning a URL that is not in the $spSites variable.

The rest of the script

I really want to emphasise that many findings here are based on experience and testing – not on documentation (except where I refer to MS docs). You may want to cross-check the results against the native reports.

Conditional Access policies – do you back up ALL of them?

This will be a short post about a recent finding: AzureAD Conditional Access policies created from a template may be missing from your backups if you are not using the Graph API beta endpoint.

TL;DR
– When you create a Conditional Access policy using the “New policy from template (Preview)” button, the policy will not show up when querying policies with the “traditional” tools
– This may also apply anytime you have preview features set in your policy
– You may want to check if all your policies are backed up
– To switch to the beta endpoint in Microsoft Graph PowerShell, use the Select-MgProfile -Name “beta” command

Explained

I was creating a new Conditional Access policy for a customer where I have my own script running as a scheduled task to back up these settings. When there is a change, I get notified, and it was strange that this time no such alert arrived. I started to investigate: no errors, but even a “full” backup did not pick up the new policy.

Time to reproduce it in sandbox environment. Here are 4 policies, one is created from template:

Policy created from template

When querying the policies this one is missing:

Results using Get-AzureADMSConditionalAccessPolicy

Same results with Graph PowerShell:

Results using Get-MgIdentityConditionalAccessPolicy

Now it is time to open up the F12 developer tools to see what the trick is. Opening the policy let me find the policy ID:

Finding the policy ID using F12 Dev Tools

Now if I try to query the policy by ID, I get the following error message:

Get-AzureADMSConditionalAccessPolicy -PolicyId '09981539-1959-4b4a-8543-1f71bc34217d'
Get-AzureADMSConditionalAccessPolicy : Error occurred while executing GetAzureADMSConditionalAccessPolicy
Code: BadRequest
Message: 1037: The policy you requested contains preview features. Use the Beta endpoint to retrieve this policy.
InnerError:
  RequestId: 1016ffe3-6338-4eaf-b334-4595a6023a6c
  DateTimeStamp: Tue, 28 Feb 2023 16:47:18 GMT
HttpStatusCode: BadRequest
HttpStatusDescription: Bad Request
HttpResponseStatus: Completed
At line:1 char:1
+ Get-AzureADMSConditionalAccessPolicy -PolicyId '09981539-1959-4b4a-85 ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Get-AzureADMSConditionalAccessPolicy], ApiException
    + FullyQualifiedErrorId : Microsoft.Open.MSGraphV10.Client.ApiException,Microsoft.Open.MSGraphV10.PowerShell.GetAzureADMSConditionalAccessPolicy
Error message

Same error with Graph PowerShell:

Get-MgIdentityConditionalAccessPolicy -ConditionalAccessPolicyId '09981539-1959-4b4a-8543-1f71bc34217d'
Get-MgIdentityConditionalAccessPolicy : 1037: The policy you requested contains preview features. Use the Beta
endpoint to retrieve this policy.
At line:1 char:1
+ Get-MgIdentityConditionalAccessPolicy -ConditionalAccessPolicyId '099 ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: ({ ConditionalAc...ndProperty =  }:<>f__AnonymousType35`3) [Get-MgIden
   tityC...cessPolicy_Get1], RestException`1
    + FullyQualifiedErrorId : BadRequest,Microsoft.Graph.PowerShell.Cmdlets.GetMgIdentityConditionalAccessPolicy_Get1

I don’t know how to use the beta endpoint with the AzureAD PowerShell module, but since it is being deprecated, I will rely on the Graph PowerShell cmdlet, which has the Select-MgProfile option to switch to the beta endpoint (link):

Select-MgProfile -Name "beta"

And here we go, now these policies are retrieved too:

Policies before and after switching to beta endpoint
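For backup purposes, the whole set can then be exported in one go; a minimal sketch (the output path is an assumption, and Connect-MgGraph with at least Policy.Read.All must already have been called):

```powershell
# switch to the beta endpoint so template-based policies are returned too
Select-MgProfile -Name "beta"

# dump every Conditional Access policy to a dated JSON file
Get-MgIdentityConditionalAccessPolicy -All |
    ConvertTo-Json -Depth 10 |
    Out-File "C:\Backup\CApolicies_$(Get-Date -Format yyyyMMdd).json"
```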

So if you have a policy backup solution in place, take a look at those backups if you are using templates or other preview features.

Intune App configuration policy – Edge/Chrome URLBlocklist on Android: ‘Expected list value’ error

It’s funny when you plan to post about a topic, then encounter an error and end up publishing about the error instead. This was the case when I was comparing the app configuration policies for Edge and Chrome and ran into the ‘Expected list value’ issue while trying to set up URLBlocklist (sidenote below).

TL;DR
– URLBlocklist should be an array, not a string
– Intune’s ‘Configuration designer’ doesn’t allow modifying the value type, use the JSON editor
– Change the URLBlocklist managedProperty from valueString to valueStringArray type and make sure the value is an array of strings, example:
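An illustrative fragment of the corrected JSON (trimmed; the kind and productId values are examples for Edge – your payload will contain whatever the JSON editor generated for your app):

```json
{
  "kind": "androidenterprise#managedConfiguration",
  "productId": "app:com.microsoft.emmx",
  "managedProperty": [
    {
      "key": "URLBlocklist",
      "valueStringArray": [
        "f12.hu",
        "facebook.com"
      ]
    }
  ]
}
```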

Hint: keep a copy of the JSON config, because Intune validates the configuration against the predefined schema and you will be presented with an empty configuration when you later attempt to modify the JSON data.

Explained

Managed configurations (formerly ‘application restrictions’) can be deployed along with the Android app, if the application has these settings defined. These configuration items are exposed to the EMM partners by using Google Play APIs [so app configuration is not some Intune magic, it’s the beauty of Android Enterprise]. When you are setting up an app configuration policy (for managed devices) the configuration keys displayed are actually read from the application’s app_restrictions.xml file.

Google Chrome managed configuration items

For Edge and Chrome there is a managedProperty called URLBlocklist aka ‘Block access to a list of URLs’. As you can see the Value type here is string:

In the following example, I’m trying to block my webpage and Facebook via this setting:

Blocklist using configuration designer
Settings opened in JSON editor

Once the settings are deployed, open edge://policy or chrome://policy and you will see the following:

‘Expected list value’ error

At this point, we should understand how the setting actually has to be configured, referring to the Chrome Enterprise documentation (link). The URLBlocklist property is a list of strings (an array), hence it does not accept a plain string. Now that we know what data type is needed, we have to figure out its name in the Intune JSON (I did the Googling for you: it is valueStringArray).

Going back to Intune, open the JSON editor and change the data type and the value:

Before (left) and after (right)

This time, there is no error and the configuration works as expected:

URLBlocklist with valueStringArray data type
Facebook blocked

Unfortunately, when you try to modify the JSON, the settings are cleared, so make sure you have a copy of your settings:

JSON configuration when trying to edit the previous settings


Sidenote: URL blocklist can also be specified for Edge using Managed app configuration, but it requires Intune App Protection Policy (APP). There are some rare scenarios where you don’t want to apply or can’t apply APP (eg. dedicated devices without Shared device mode)

Nextcloud with AzureAD Application Proxy

There are certain scenarios where Microsoft’s OneDrive/SharePoint solution is not an option for storing data (eg. data localization restrictions enforced by law). However, if you still want to provide your users with the file sync experience and/or other collaboration features, you may have come across ownCloud or Nextcloud as an alternative. But have you considered publishing it via Azure AD Application Proxy?

In this post, I will install a Nextcloud server on Ubuntu and integrate it with AzureAD and publish with the AzureAD Application Proxy service.

Prerequisites:

  • custom DNS domain
  • certificate for Nextcloud app (eg. nextcloud.f12.hu)
  • VM for Nextcloud server, ubuntu server installer
  • Windows Server VM for Application Proxy connector
  • Azure AD Premium P1 licence (or other bundles including AAD P1)

TL;DR

  • install a new Ubuntu server with nextcloud snap
  • do the basic setup for nextcloud, including https
  • from the App bundles install “SSO & SAML authentication”
  • install Azure AD App Proxy connector on a Windows Server with direct line-of-sight on the Nextcloud server (HTTPS access should be enough)
  • DNS setting (split brain DNS): the Windows Server with the App Proxy connector should resolve the FQDN of the app (eg.: nextcloud.f12.hu) to the private address of the Nextcloud server.
  • on the Application proxy page – “Configure an app” – create the Nextcloud app and configure SAML SSO (there is a very good and detailed post on this by Nate Russel, I will cover the required steps too)
  • configure SAML settings on the Nextcloud server (as per in the previous link or see below)

This post will NOT cover:

  • proper storage configuration for Nextcloud and other design principles
  • user matching with previously configured accounts in the Nextcloud environment
  • preparing certificate
  • Active Directory integration (AzureAD only)

Infrastructure basic setup

I’m not an Ubuntu expert, so I left everything as default/recommended during installation – only thing to mention here is that I chose to install the nextcloud snap. While waiting for installation, I created a DNS entry (nextcloud.f12.hu) pointing to the internal IP address of the Nextcloud server.

When the installation was ready, I navigated to the website (http://<ipaddress of the server>) and created the admin account:

Initialization

The next step is to set up the listener (so the server answers to nextcloud.f12.hu, not only by IP).

nano /var/snap/nextcloud/current/nextcloud/config/config.php
config.php trusted_domains
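For reference, the relevant part of config.php looks something like this after the edit (the array index numbering may differ in your installation):

```php
'trusted_domains' =>
  array (
    0 => 'localhost',
    1 => 'nextcloud.f12.hu',
  ),
```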

Upload the certificate, key and chain then enable https for nextcloud:

nextcloud.enable-https custom <cert> <key> <chain>
enable-https

Now we are ready:

HTTPS enabled

Now navigate to Apps – App bundles and install “SSO & SAML authentication” package

SAML authentication package download

After that navigate to Settings – SSO & SAML authentication and select “Use built-in SAML authentication”

And stop here, but don’t close the browser window, we will come back soon.

Azure Active Directory configuration

Navigate to portal.azure.com – Azure Active Directory – Application proxy. After downloading the connector and installing it on a Windows Server, the service is enabled. (Very briefly: this server will be the proxy that “receives the request” and translates the URL to the internal URL specified below. The trick is that the internal and the external URL are the same, which allows a consistent user experience.)

So click on Configure an app:

configure an app

On the “Add your own on-premises application” page, enter the name and the URL values (reminder: the internal and external URLs are the same) – and don’t forget to register the CNAME entry in the public DNS zone.

Adding the Nextcloud application

One sidenote: using Azure Active Directory Pre Authentication is fine as long as you plan to access Nextcloud from a browser only. If you need the sync client to work, this needs to be switched to Passthrough (with the security considerations in mind).

When ready, click Create, then on the same screen you can upload the PFX certificate:

certificate upload

Next, take care of assignments – you can either assign it to a set of users (Users and groups tab) or in the Properties tab, you can set the “Assignment required” setting to No

Assignment required-No

Next, click on Single sign-on and select SAML authentication. The Basic SAML Configuration blade should include the Entity ID and the Reply URL(s) with the following values:

Basic SAML configuration

On the SAML certificates blade, you can either download the Federation Metadata XML and copy the value of the <X509Certificate> tag as per the tutorial linked above, or you can download the Base64 encoded certificate, open it with Notepad and remove the -----BEGIN CERTIFICATE----- and -----END CERTIFICATE----- lines and the line breaks. Either way, this certificate will be needed in the Nextcloud SAML configuration.
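Removing the header, footer and line breaks can also be scripted; a small PowerShell sketch (the file name and the shortened Base64 payload are illustrative):

```powershell
# sample exported certificate (Base64 payload shortened for illustration)
Set-Content .\NextcloudSAML.cer @'
-----BEGIN CERTIFICATE-----
MIIC8DCCAdigAwIBAgIQ
aBcD1234
-----END CERTIFICATE-----
'@

# drop the BEGIN/END lines and join the remaining Base64 payload into a single line
$oneLineCert = (Get-Content .\NextcloudSAML.cer |
    Where-Object { $_ -notmatch 'CERTIFICATE' }) -join ''
$oneLineCert   # MIIC8DCCAdigAwIBAgIQaBcD1234
```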

SAML Certificate download

Now that we have the certificate, navigate back to the Nextcloud SAML settings and fill in the values as per Nate’s tutorial:

SAML setup

And the final step is to enable SAML:

Enable SAML

Now when we navigate to the Nextcloud server’s URL we will be redirected to the Microsoft login portal and that’s it. (Hint: as it is an Enterprise application you can apply Conditional Access policies to make it more secure, just keep in mind the considerations mentioned above about Passthrough pre-authentication)

Monitor AzureAD App registration expiration with PowerShell (GraphAPI)

There are several methods for monitoring Azure AD App registration expiration (like Power Automate or Azure Logic Apps), but these require extra licences or an Azure subscription. The PowerShell way is free and only requires a new app registration in AzureAD.

TL;DR

  • Create a new app registration with Microsoft Graph Application.Read.All application permission
  • Add a client secret to the app, copy the secret as it will be used in the script
  • Use the script below, fill the variables $tenantID, $appID, $appSecret, $daysBeforeExpiration
  • The output is a PSCustomObject, it is up to you to process it (the example below converts it to a bordered HTML table which is then sent in email – make sure to correct the parameters when using it)

The script:

$tenantID = ''
$appID = ''
$appSecret = ''
$daysBeforeExpiration = 30

#Ensure TLS1.2 is used
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

#Get access token
$scope = 'https://graph.microsoft.com/.default'
$oAuthUri = "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token"
$body = [Ordered] @{
    scope = "$scope"
    client_id = "$appId"
    client_secret = "$appSecret"
    grant_type = 'client_credentials'
}
$response = Invoke-RestMethod -Method Post -Uri $oAuthUri -Body $body -ErrorAction Stop
$aadToken = $response.access_token

#Query app registrations, store them in the $applications variable
$url = 'https://graph.microsoft.com/v1.0/applications'

$headers = @{
    'Content-Type'  = 'application/json'
    'Accept'        = 'application/json'
    'Authorization' = "Bearer $aadToken"
}

$applications = $null
while ($url -ne $null){
    $json_response = (Invoke-WebRequest -Method Get -Uri $url -Headers $headers -ErrorAction Stop | ConvertFrom-Json) #use -UseBasicParsing if scheduling with 'NT Authority\Network Service'
    $applications += $json_response.value
    $url = $json_response.'@odata.nextLink'
}

#for each app select the latest credential of each credential type
$apps_Cred_latest = foreach ($app in $applications){
    [pscustomobject]@{
        Name       = $app.displayName
        LatestKey  = $app.keyCredentials.endDateTime | sort -Descending | select -First 1
        LatestPass = $app.passwordCredentials.endDateTime | sort -Descending | select -First 1
    }
}

#select apps whose newest credential expires within the range defined in $daysBeforeExpiration
$expiringApps = $apps_Cred_latest | ? { $_.LatestKey -or $_.LatestPass } |
    ? { [datetime](($_.LatestKey, $_.LatestPass | sort -Descending | select -First 1)) -le (Get-Date).AddDays($daysBeforeExpiration) }


if ($expiringApps){
$Header = @"
<style>
TABLE {border-width: 1px; border-style: solid; border-color: black; border-collapse: collapse;}
TH {border-width: 1px; padding: 3px; border-style: solid; border-color: black;}
TD {border-width: 1px; padding: 3px; border-style: solid; border-color: black;}
</style>
"@

[string]$html_expiringApps = $expiringApps | convertto-html -Head $Header

Send-MailMessage -From "daniel@f12.hu" -To "reports@f12.hu" -SmtpServer smtp.f12.hu -Port 25 -Subject "Azure AD credentials expiring in $daysBeforeExpiration days" -Body $html_expiringApps -BodyAsHtml -Encoding UTF8
}