Halloween is over but the world is still a scary place. Continuing a thought from an article I wrote a few weeks back on Auditing AzureSQL Firewall Policies, I thought I would also include a short function for auditing Azure storage accounts that are currently configured for "public" access.

Why are public access containers such a big deal? Simply put, anyone who knows the URL to the container and file can download that file. This is perfectly fine for public sites and public data (though you are paying for all the egress bandwidth associated with those outbound transfers in Azure… which you may not want to do). Any container that is marked as public access should have a documented business reason for why it is configured that way, and any container that holds even slightly sensitive data shouldn't be marked as public. I would go so far as to say that if there isn't a particularly good reason for a blob container to be public, it's better to just play it safe and mark it private. At the very least, you should be aware of it and keep tabs on it on a regular basis.
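If you do find a container that shouldn't be public, flipping it back to private is a one-liner. Here's a rough sketch using the Az.Storage module — the resource group, storage account, and container names below are placeholders you'd substitute with your own:

```powershell
# Placeholders for illustration - substitute your own names
$ctx = (Get-AzStorageAccount -ResourceGroupName "MyRG" -Name "mystorageacct").Context

# "Off" removes anonymous access entirely (the other levels are "Blob" and "Container")
Set-AzStorageContainerAcl -Context $ctx -Name "mycontainer" -Permission Off
```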

Okay, #dismount soap box# – on to the code!
Below is a function I wrote to make auditing your public containers easy…

Function Get-PublicAccessContainers {
    $Subscriptions = Get-AzSubscription

    ForEach ($Subscription in $Subscriptions) {
        # Switch the Az context to the current subscription
        Select-AzSubscription $Subscription.Name | Out-Null
        $StorageAccounts = Get-AzStorageAccount
        ForEach ($StorageAccount in $StorageAccounts) {
            $RGname = $StorageAccount.ResourceGroupName
            $SAname = $StorageAccount.StorageAccountName
            $Containers = Get-AzRmStorageContainer -ResourceGroupName $RGname -StorageAccountName $SAname
            ForEach ($Container in $Containers) {
                # PublicAccess is "None", "Blob" (anonymous blob reads), or
                # "Container" (anonymous reads plus container listing) - we
                # want to flag both public levels, not just "Blob"
                If ($Container.PublicAccess -in "Blob", "Container") {
                    $Container | Select-Object Name, PublicAccess, LastModifiedTime, StorageAccountName, @{l="Subscription";e={$Subscription.Name}}
                }
            }
        }
    }
}

After loading the above function, you would then run it and pipe it out to a CSV for easy reading like so:

Get-PublicAccessContainers | Export-Csv "C:\output\PublicContainers.csv" -NoTypeInformation

The function does the following:

  1. Gets a list of all the subscriptions your account has access to.
  2. Walks each of those subscriptions and gets a list of storage accounts in each one.
  3. For each of the storage accounts it finds, it then enumerates its containers and returns any that are configured for public access. It also provides the subscription name and storage account they belong to for easy filtering.
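As a side note, newer versions of Az.Storage also surface an account-level AllowBlobPublicAccess property, which (when set to $false) shuts off anonymous access for every container in the account regardless of the container-level setting. Assuming your module version exposes it, a quick account-level sanity check might look like:

```powershell
# $false at the account level trumps the container-level PublicAccess setting;
# a blank value means the account is still using the platform default behavior
Get-AzStorageAccount |
    Select-Object StorageAccountName, ResourceGroupName, AllowBlobPublicAccess
```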



You get brownie points if you configure the above in an Automation Account, run it on a schedule, and attach the resulting CSV to an email sent to your security department. Be the kid on Halloween that rides in your parent’s car from house to house, don’t walk if you don’t have to.
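For what it's worth, a runbook wrapping the function might look something like the sketch below. The SMTP server and email addresses are placeholders, and it assumes the Automation Account has a system-assigned managed identity with reader access to your subscriptions:

```powershell
# Authenticate as the Automation Account's managed identity
# (an assumption - see the note above)
Connect-AzAccount -Identity | Out-Null

$Report = Join-Path $env:TEMP "PublicContainers.csv"
Get-PublicAccessContainers | Export-Csv $Report -NoTypeInformation

# Placeholder SMTP details - substitute your own
Send-MailMessage -SmtpServer "smtp.example.com" `
    -From "automation@example.com" -To "security@example.com" `
    -Subject "Public storage container audit" -Attachments $Report
```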

I like to run this along with the Azure SQL firewall auditing bit I talked about in my previous post. Knowing what is being done at the edge(s) of your environment is critical for good security. The cloud has (like so many other things) radically changed where those edges and security boundaries are, and in many cases multiplied them substantially.

Cheers!
