Azure Analysis Services Tracer (aka AzureASTrace.exe)

Recently we had the need to analyse the queries made by users on Azure Analysis Services and to cross-reference that data with Azure AS metrics, for example to see exactly which queries are causing high QPU or memory usage, and who issued them from which application.

Currently Azure AS allows you to configure an Extended Events session to collect events from your Analysis Services database:


But there’s no easy way to export or save that data for further analysis. You can only watch the live data, and it’s not very user friendly:


We tried to use the good old ASTrace, but it’s not compatible with Azure Analysis Services, and it’s not a good practice anyway because it basically creates a Profiler session, a feature that will be deprecated soon.

Because we desperately needed to analyse the user queries to identify bottlenecks, my amazing BI team at DevScope built a great tool called “Azure-As-Trace” that lets you point at an Analysis Services instance, instantly start collecting the events you want, and store them in the file system in JSONL format.

You can download it or contribute to it on GitHub:

It’s very simple to use: just download the binaries and change the following parameters in the config file ‘AzureASTrace.exe.config’:

  • ConnectionStrings/AnalysisServices – the connection string of the Analysis Services instance you want to monitor
  • AppSettings/XEventTemplateFilePath – the path to the XEvents trace template used to create the monitoring session on the Analysis Services instance
  • AppSettings/OutputFolder – the path to the output folder that will hold the JSONL files
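As a rough sketch, a standard .NET app.config with those three settings could look like this; everything besides the setting names listed above (server name, paths, connection string details) is a placeholder, so check the file shipped with the binaries for the exact layout:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <connectionStrings>
    <!-- Connection string of the Azure AS instance to monitor (values are placeholders) -->
    <add name="AnalysisServices"
         connectionString="Provider=MSOLAP;Data Source=asazure://westeurope.asazure.windows.net/yourserver" />
  </connectionStrings>
  <appSettings>
    <!-- XEvents trace template used to create the monitoring session -->
    <add key="XEventTemplateFilePath" value=".\TraceTemplate.xml" />
    <!-- Folder where the JSONL event files will be written -->
    <add key="OutputFolder" value="C:\AzureASTrace\Output" />
  </appSettings>
</configuration>
```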


After that you have two options:

  • Run AzureASTrace as a console application, by simply executing AzureASTrace.exe


  • Run AzureASTrace as a Windows service, by running ‘setup.install.bat’ and starting the service


Either way, while running, the events will be saved to the output folder. AzureASTrace creates a file for every subscribed event type and groups the files by day:


Now you can analyze those events in Power BI (coming soon) very easily…
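Until then, you can take a quick ad-hoc look at the output from PowerShell itself, since each line of a JSONL file is one JSON document. A minimal sketch; the file path and the event property names (Duration, NTUserName, TextData are the usual AS trace columns) are assumptions:

```powershell
# Parse one day's worth of QueryEnd events (path is a placeholder)
$events = Get-Content 'C:\AzureASTrace\Output\QueryEnd\20170301.jsonl' |
    ForEach-Object { $_ | ConvertFrom-Json }

# Top 10 longest-running queries, with the user who issued them
$events |
    Sort-Object Duration -Descending |
    Select-Object -First 10 NTUserName, Duration, TextData
```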

Automatically pause/resume and scale up/down Azure Analysis Services using AzureRM.AnalysisServices

One of the big advantages of Azure Analysis Services is the ability to pause/resume and scale up/down as needed. This allows you to pay only for what you use and greatly reduce costs.

The Azure Analysis Services team released a PowerShell module, “AzureRM.AnalysisServices”, with cmdlets to manage your Azure Analysis Services resources, and they couldn’t be easier to use:

  • Get-AzureRmAnalysisServicesServer – To get your server metadata and current status
  • Suspend-AzureRmAnalysisServicesServer – To suspend a server
  • Resume-AzureRmAnalysisServicesServer – To resume a server
  • Set-AzureRmAnalysisServicesServer – To update a server, e.g. to change the SKU

More details here.
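Put together, a manual scale-down-and-pause at the end of the day could look like this sketch (the resource group and server names are placeholders, and it assumes an already authenticated session):

```powershell
# Requires the AzureRM.AnalysisServices module and Login-AzureRmAccount beforehand
$server = Get-AzureRmAnalysisServicesServer -ResourceGroupName 'MyResourceGroup' -Name 'myasserver'

if ($server.State -ne 'Paused') {
    # Drop to the cheapest SKU before suspending, so a later resume starts small
    Set-AzureRmAnalysisServicesServer -ResourceGroupName 'MyResourceGroup' -Name 'myasserver' -Sku 'S0'
    Suspend-AzureRmAnalysisServicesServer -ResourceGroupName 'MyResourceGroup' -Name 'myasserver'
}
```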

But this is only effective if we somehow automate the operation; it’s not feasible to have someone on the team, or the customer, actively pausing/resuming or scaling the instance up/down.

With that in mind we built a very simple PowerShell script where you configure at which times and days Azure AS should be on, and on which SKU.

Download the full script here.

The script is configured by a JSON metadata string:


The above metadata will configure Azure AS to:

Days      Hours                       SKU
Mon–Fri   8 AM to 6 PM (peak hours)   S1
Mon–Fri   6 PM to 12 AM (off-peak)    S0
Sat–Sun   8 AM to 12 AM               S0
Other     Other                       Pause
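The JSON metadata itself was shown as a screenshot, but a configuration expressing that schedule could look something along these lines; the property names here are assumptions, so check the downloaded script for the exact schema it expects:

```json
[
  { "days": ["Mon", "Tue", "Wed", "Thu", "Fri"], "from": "08:00", "to": "18:00", "sku": "S1" },
  { "days": ["Mon", "Tue", "Wed", "Thu", "Fri"], "from": "18:00", "to": "00:00", "sku": "S0" },
  { "days": ["Sat", "Sun"], "from": "08:00", "to": "00:00", "sku": "S0" }
]
```

Any time slot not covered by an entry means the server should be paused.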

The PowerShell script has the following parameters:

-resourceGroupName The name of the Azure resource group your Azure AS server is deployed to:


-serverName The name of the Azure AS Server:


-configStr The JSON metadata config string


The path to an Azure profile stored locally using the “Save-AzureRmContext” cmdlet.

This is useful to test the script locally.



-azureRunAsConnectionName The name of the Azure connection, if you want to run this script in an Azure Automation runbook:


Probably most of you will want to run this script in a PowerShell runbook in Azure Automation; learn how to set up Azure Automation here.

Next you will need to register the module “AzureRM.AnalysisServices” as an Asset of your automation account:


After that just create a new PowerShell runbook:


Paste the script and remember to set the parameter -azureRunAsConnectionName:


Publish the script and create a schedule:



That’s it! You now have your Azure AS automatically pausing/resuming and scaling up/down using a configuration file you defined.

Now just lay back and measure the savings at the end of the month!

Automate Azure Analysis Services Pause/Resume using PowerShell

This is a guest post from my colleague Filipe Sousa from DevScope, who will share his very recent findings on automating management tasks in Azure Analysis Services. Take it away, Filipe:

Recently we came across the need to use one of the newest Azure services – Azure Analysis Services (AS). This led us to an awesome Software as a Service (SaaS) offering: dazzling query speed, stunning scalability… and a new administration paradigm, administering SaaS in the cloud.

Since Azure Analysis Services is charged hourly, and we knew we would not use the service 24/7, how could we automate the pause/resume feature of Azure Analysis Services so that we could optimize savings?

It couldn’t be more straightforward, except for some lack of documentation/examples. Thanks to Josh Caplan for pointing us in the right direction: Azure Analysis Services REST API

First, so that the REST calls to the ARM API can be authenticated, we need to create an app account in Azure AD. This can be done manually, as a standalone act, or, better yet, as part of an Azure Automation account with a Run As account creation. The latter will deploy a new service principal in Azure Active Directory (AD) for us, along with a certificate, and assigns the contributor role-based access control so that ARM can use it in further runbooks.

Recap, we will need:

An Azure Automation account, so that we can have:

· Runbook(s) – for this exercise, specifically a PowerShell runbook;

· A Run As account, so that the script can authenticate against Azure AD;

· Schedules to run the runbook.

This is how you can achieve it:


(If you already have an automation account but don’t have a Run As account, create an Application account in Azure AD.)

Having created the Azure Automation account, we can peek at the new Run As account, with the service principal already created for us:


Additionally, we can take the opportunity to gather the application, tenant and subscription IDs; they will serve us later.

With the Automation account in place, it’s time to create a key for it: go to your app account in Azure AD, select Keys in the All settings menu, and create a new key with the desired duration. Copy the key value and save it somewhere safe – you won’t be able to get it later!


For now, all we have to do is to collect:

· ApplicationID: in Azure AD –> App Registrations –> the name of the app we just created

· Application Key: Collected from the previous steps

· TenantID: Azure Active Directory –> Properties –> Directory ID value

· SubscriptionID: From the Azure URL:…

· Resource group name: From the Azure URL:…/resourceGroups/xxxResourceGroup/…

· SSAS server name: Analysis Services -> YourServerName

With those at hand, replace these values in the script below and save it somewhere for now – we encourage you to develop and test your PowerShell scripts in the PowerShell ISE – and yes, this script will also work on an on-premises machine.

#region parameters
param(
    [Parameter(Mandatory = $true)]
    [System.String]$action = 'suspend',

    [Parameter(Mandatory = $true)]
    [System.String]$resourceGroupName = 'YourResourceGroup',

    [Parameter(Mandatory = $true)]
    [System.String]$serverName = 'YourAsServerName'
)
#endregion

#region variables
    $ClientID       = 'YourApplicationId'
    $ClientSecret   = 'YourApplicationKey'
    $TenantId       = 'YourTenantId'
    $SubscriptionId = 'YourSubscriptionId'
#endregion

#region Get Access Token
    $TokenEndpoint = 'https://login.windows.net/{0}/oauth2/token' -f $TenantId
    $ARMResource   = 'https://management.core.windows.net/'

    $Body = @{
        'resource'      = $ARMResource
        'client_id'     = $ClientID
        'grant_type'    = 'client_credentials'
        'client_secret' = $ClientSecret
    }

    $params = @{
        ContentType = 'application/x-www-form-urlencoded'
        Headers     = @{'accept' = 'application/json'}
        Body        = $Body
        Method      = 'Post'
        URI         = $TokenEndpoint
    }

    $token = Invoke-RestMethod @params
#endregion

#region Suspend/Resume AS -> depending on the action parameter
    #POST /subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.AnalysisServices/servers/{serverName}/resume?api-version=2016-05-16
    #POST /subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.AnalysisServices/servers/{serverName}/suspend?api-version=2016-05-16

    $requestUri = "https://management.azure.com/subscriptions/$SubscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.AnalysisServices/servers/$serverName/${action}?api-version=2016-05-16"

    $params = @{
        ContentType = 'application/x-www-form-urlencoded'
        Headers     = @{'authorization' = "Bearer $($token.access_token)"}
        Method      = 'Post'
        URI         = $requestUri
    }

    Invoke-RestMethod @params
#endregion
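Saved locally, the script can be exercised from a PowerShell prompt before moving it into a runbook; the file name and parameter values below are just placeholders:

```powershell
# Suspend the server
.\SuspendResumeAzureAS.ps1 -action 'suspend' -resourceGroupName 'MyResourceGroup' -serverName 'myasserver'

# Resume it again
.\SuspendResumeAzureAS.ps1 -action 'resume' -resourceGroupName 'MyResourceGroup' -serverName 'myasserver'
```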


With the PowerShell script assembled – note that one of the script parameters is the action (suspend/resume) we want the script to execute against the AS server – the next steps are:

· Create a runbook of type PowerShell within the earlier-created automation account, paste the previous script, save it and… voilà, ready to test, publish and automate!


· The next step is to publish the runbook so that it can be used in a schedule, fully automating the suspend/resume procedure. After publishing the runbook, create/assign it schedules – one to suspend and another to resume the AS server:


Afterwards configure the desired script parameters for each schedule:


The final result should look like this and give us the desired suspend/resume Azure AS automation.
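The schedules and their linkage to the runbook can also be created from PowerShell instead of the portal. A sketch using the AzureRM.Automation cmdlets; the account, runbook and server names are placeholders, and it assumes an authenticated session:

```powershell
# Common parameters for the automation account (values are placeholders)
$common = @{ ResourceGroupName = 'MyResourceGroup'; AutomationAccountName = 'MyAutomationAccount' }

# One daily schedule to suspend at 20:00 and one to resume at 08:00
New-AzureRmAutomationSchedule @common -Name 'SuspendAS' -StartTime '20:00' -DayInterval 1
New-AzureRmAutomationSchedule @common -Name 'ResumeAS'  -StartTime '08:00' -DayInterval 1

# Link each schedule to the runbook, passing the matching action parameter
Register-AzureRmAutomationScheduledRunbook @common -RunbookName 'SuspendResumeAzureAS' -ScheduleName 'SuspendAS' `
    -Parameters @{ action = 'suspend'; resourceGroupName = 'MyResourceGroup'; serverName = 'myasserver' }
Register-AzureRmAutomationScheduledRunbook @common -RunbookName 'SuspendResumeAzureAS' -ScheduleName 'ResumeAS' `
    -Parameters @{ action = 'resume'; resourceGroupName = 'MyResourceGroup'; serverName = 'myasserver' }
```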


Hope that you’ve learned from our post; have a nice Azure automation, and leave your comments below!

Filipe Sousa

Azure Storage Explorer

Just to recommend a nice explorer for Azure Storage:

I’ve been using it for a while and it seems to be the best tool at the moment to explore your data in Azure Storage or local storage.

Any other recommendations are very welcome!

Azure Table Storage with dynamic entities objects

In a recent project I used Azure Table Storage, but I needed to keep the dynamic schema model that is present in the cloud.

As you might know, to use Azure Table Storage you need to build a class that represents your table schema, a pattern similar to what is done with NHibernate.

In the local development table storage, the API automatically creates the tables that represent the class you created, similar to what is done with NHibernate. But in the cloud the data is stored using Entity-Attribute-Value (EAV) tables, which means there is only one schema, shared by all tables.

I wanted to keep this dynamic model and continue to use the API (StorageClient) present in the AzureServicesKit, the only one supported by Microsoft.

So the solution I found to accomplish this was to implement two events on the TableStorageDataServiceContext class:

  • ReadingEntity – fired when the context is reading the entity XML present in the cloud
  • WritingEntity – fired when the context is about to write an entity to the cloud

And finally, create a serializable class with a property bag that serializes respecting the schema of an Atom entry.

I implemented a class library in C#, the code is very easy to read and understand.


Examples of usage:

EAVContext context = new EAVContext(); // this class inherits from TableStorageDataServiceContext

List<EAVEntry> entries = context.Entries("TableName").ToList();

EAVEntry entry = new EAVEntry();
entry.AtomUpdated = DateTime.Now;
entry["TestProperty1"] = "Test";
entry["TestProperty2"] = 999;
entry["TestDateProperty"] = DateTime.Now.AddDays(20);

context.AddEntry("TableName", entry);

EAVEntry entry2 = context.Entries("TableName").Where(e => e.PartitionKey == entry.PartitionKey && e.RowKey == entry.RowKey).FirstOrDefault();

entry2["NewProp"] = DateTime.Now;
entry2["NewIntProperty"] = (entry.Properties.ContainsKey("NewIntProperty") ? Convert.ToInt32(entry["NewIntProperty"]) : 0) + 1;



Note: I didn’t find a class anywhere to serialize and deserialize an Atom entry, so I created my own (EAVEntry) – does anyone know a better way?

Hope it helps