Automate Azure Analysis Services Pause/Resume using PowerShell

This is a guest post from my colleague Filipe Sousa from DevScope, who will share his very recent findings in automating management tasks in Azure Analysis Services. Take it away, Filipe:

Recently we came across the need to use one of the newest Azure services – Azure Analysis Services (AS). This led us to an awesome Software as a Service (SaaS) offering: dazzling query speed, stunning scalability… and a new administration paradigm, administering the service in the cloud.

Since Azure Analysis Services is charged hourly and we knew that we would not use the service 24/7, how could we automate its pause/resume feature so that we could optimize savings?

It couldn't be more straightforward, apart from some lack of documentation/examples; thanks to Josh Caplan for pointing us in the right direction: the Azure Analysis Services REST API.

First, so that the REST calls to the ARM API can be authenticated, we need to create an app account in Azure AD. This can be done manually, as a standalone act, or, better yet, as part of an Azure Automation account with a Run As account creation. The latter will deploy a new service principal in Azure Active Directory (AD) for us, along with a certificate, and assigns it the Contributor role-based access control so that ARM can use it in further runbooks.
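If you prefer scripting over the portal, the same app account, service principal and Contributor role assignment can be sketched with the AzureRM cmdlets. This is only an illustrative sketch: the display name, URIs and password below are placeholders, and a portal-created Run As account does the equivalent (with a certificate) for you.

```powershell
# Sketch, assuming the AzureRM module and a prior Login-AzureRmAccount;
# all names and the password are illustrative placeholders.
$app = New-AzureRmADApplication -DisplayName 'AsAutomationApp' `
          -HomePage 'http://asautomationapp' `
          -IdentifierUris 'http://asautomationapp' `
          -Password 'ReplaceWithAStrongPassword'

# Create the service principal for the application
New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId

# Grant it the Contributor role so ARM calls made from the runbook succeed
New-AzureRmRoleAssignment -RoleDefinitionName 'Contributor' `
    -ServicePrincipalName $app.ApplicationId
```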

Recap, we will need:

An Azure Automation Account so that we can have:

· Runbook(s) – for this exercise, specifically a PowerShell runbook;

· A Run As account so that the script can authenticate against Azure AD;

· Schedules to run the runbook.

This is how you can achieve it:


(If you already have an automation account but don't have a Run As account, create an Application account in Azure AD.)

Having created the Azure Automation account, we can peek at the new Run As account with the service principal already created for us:


Additionally, we can take the opportunity to gather the application, tenant and subscription IDs; they will serve us later.

Having the Automation account in place, it is time to create a key for the app: go to your app account in Azure AD, in the All settings menu select Keys, and create a new key with the desired duration. Copy the key value and save it somewhere safe; you won't be able to get it later!


For now, all we have to do is to collect:

· ApplicationID: in Azure AD –> App Registrations –> the name of the app we just created

· Application Key: Collected from the previous steps

· TenantID: Azure Active Directory –> Properties –> Directory ID value

· SubscriptionID: From the Azure URL:…

· Resource group name: From the Azure URL:…/resourceGroups/xxxResourceGroup/…

· AS server name: Analysis Services -> YourServerName

With those in hand, replace the corresponding values in the script below and save it somewhere for now – we encourage you to develop and test your PowerShell scripts in the PowerShell ISE. And yes, this script will also work on an on-premises machine.

#region parameters
param(
    [Parameter(Mandatory = $true)]
    [System.String]$action = 'suspend',

    [Parameter(Mandatory = $true)]
    [System.String]$resourceGroupName = 'YourResourceGroup',

    [Parameter(Mandatory = $true)]
    [System.String]$serverName = 'YourAsServerName'
)
#endregion

#region variables
    $ClientID       = 'YourApplicationId'
    $ClientSecret   = 'YourApplicationKey'
    $TenantId       = 'YourTenantId'
    $SubscriptionId = 'YourSubscriptionId'
#endregion

#region get access token
    $TokenEndpoint = 'https://login.microsoftonline.com/{0}/oauth2/token' -f $TenantId
    $ARMResource   = 'https://management.azure.com/'

    $Body = @{
        'resource'      = $ARMResource
        'client_id'     = $ClientID
        'grant_type'    = 'client_credentials'
        'client_secret' = $ClientSecret
    }

    $params = @{
        ContentType = 'application/x-www-form-urlencoded'
        Headers     = @{ 'accept' = 'application/json' }
        Body        = $Body
        Method      = 'Post'
        URI         = $TokenEndpoint
    }

    $token = Invoke-RestMethod @params
#endregion

#region suspend/resume AS, depending on the action parameter
    #POST /subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.AnalysisServices/servers/{serverName}/resume?api-version=2016-05-16

    #POST /subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.AnalysisServices/servers/{serverName}/suspend?api-version=2016-05-16

    $requestUri = "https://management.azure.com/subscriptions/$SubscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.AnalysisServices/servers/$serverName/${action}?api-version=2016-05-16"

    $params = @{
        ContentType = 'application/json'
        Headers     = @{ 'authorization' = "Bearer $($token.access_token)" }
        Method      = 'Post'
        URI         = $requestUri
    }

    Invoke-RestMethod @params
#endregion
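Saved locally (the file name below is hypothetical), the script can be smoke-tested from the PowerShell ISE or a console before moving it into a runbook:

```powershell
# Hypothetical file name; run once with each action to verify both paths
.\AzureAsPauseResume.ps1 -action 'suspend' -resourceGroupName 'YourResourceGroup' -serverName 'YourAsServerName'
.\AzureAsPauseResume.ps1 -action 'resume'  -resourceGroupName 'YourResourceGroup' -serverName 'YourAsServerName'
```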


With the PowerShell script assembled – note that one of the script parameters is the action (suspend/resume) that we want the script to execute against the AS server – the next steps are:

· Create a runbook of type PowerShell within the earlier created automation account, paste the previous script, save it and… voilà, it is ready to test, publish and automate!


· The next step is to publish the runbook so that it can be used in a schedule, fully automating the suspend/resume procedure. After publishing the runbook, create/assign schedules for it – one to suspend and another to resume the AS server:


Afterwards configure the desired script parameters for each schedule:


The final result should look like this and give us the desired suspend/resume Azure AS automation.


We hope that you've learned something from our post. Happy Azure automation, and leave your comments below!

Filipe Sousa

Azure Storage Explorer

Just to recommend a nice explorer for Azure Storage:

I've been using it for a while and it seems to be the best tool at the moment to explore your data in Azure Storage or local storage.

Any other recommendations are very welcome!

Azure Table Storage with dynamic entities objects

In a recent project I used Azure Table Storage, but I needed to keep the dynamic schema model that is present in the cloud.

As you might know, to use Azure Table Storage you need to build a class that represents your table schema, a pattern similar to what is done with NHibernate.

In the local development table storage, the API automatically creates the tables that represent the classes you created. But in the cloud the data is stored using Entity-Attribute-Value (EAV) tables, which means that there is only one schema, equal for all tables.

I wanted to keep this dynamic model and to continue to use the API (StorageClient) present in the AzureServicesKit, the only one that is supported by Microsoft.

So the solution that I found to accomplish this was to handle two events on the TableStorageDataServiceContext class:

  • ReadingEntity – fired when the context is reading the entity XML stored in the cloud
  • WritingEntity – fired when the context is about to write an entity to the cloud

And finally, I created a serializable class with a property bag that serializes respecting the schema of an Atom entry.
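For reference, this is roughly what such an Atom entry looks like on the wire: the entity's properties live under the m:properties element, each annotated with an EDM type (string properties may omit the m:type attribute). The property names and values below are illustrative:

```xml
<?xml version="1.0" encoding="utf-8"?>
<entry xmlns="http://www.w3.org/2005/Atom"
       xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices"
       xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
  <title />
  <updated>2009-03-18T11:48:34Z</updated>
  <author><name /></author>
  <id />
  <content type="application/xml">
    <m:properties>
      <d:PartitionKey>pk1</d:PartitionKey>
      <d:RowKey>rk1</d:RowKey>
      <d:Timestamp m:type="Edm.DateTime">2009-03-18T11:48:34Z</d:Timestamp>
      <d:TestProperty1>Test</d:TestProperty1>
      <d:TestProperty2 m:type="Edm.Int32">999</d:TestProperty2>
    </m:properties>
  </content>
</entry>
```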

I implemented a class library in C#, the code is very easy to read and understand.


Examples of usage:

EAVContext context = new EAVContext(); // this class inherits from TableStorageDataServiceContext

List<EAVEntry> entries = context.Entries("TableName").ToList();

// Build an entity with an arbitrary, dynamic set of properties
EAVEntry entry = new EAVEntry();
entry.AtomUpdated = DateTime.Now;
entry["TestProperty1"] = "Test";
entry["TestProperty2"] = 999;
entry["TestDateProperty"] = DateTime.Now.AddDays(20);

context.AddEntry("TableName", entry);

// Read it back by key and add new properties on the fly
EAVEntry entry2 = context.Entries("TableName").Where(e => e.PartitionKey == entry.PartitionKey && e.RowKey == entry.RowKey).FirstOrDefault();

entry2["NewProp"] = DateTime.Now;
entry2["NewIntProperty"] = (entry2.Properties.ContainsKey("NewIntProperty") ? Convert.ToInt32(entry2["NewIntProperty"]) : 0) + 1;



Note: I didn't find anywhere a class to serialize and deserialize an Atom entry, so I created my own (EAVEntry). Does anyone know a better way?

Hope it helps