Automate Azure Analysis Services Pause/Resume using PowerShell

This is a guest post from my colleague Filipe Sousa from DevScope, who will share his very recent findings in automating management tasks in Azure Analysis Services. Take it away, Filipe:

Recently we came across the need to use one of the newest Azure services – Azure Analysis Services (AS). This led us to an awesome Software as a Service (SaaS) offering – dazzling query speed, stunning scalability…and a new administration paradigm: administering SaaS in the cloud.

Since Azure Analysis Services is charged hourly, and we knew we would not use the service 24/7, how could we automate the pause/resume feature of Azure Analysis Services to optimize savings?

It couldn’t be more straightforward, except for some lack of documentation/examples. Thanks to Josh Caplan for pointing us in the right direction: the Azure Analysis Services REST API.

First, so that the REST calls to the ARM API can be authenticated, we need to create an app account in Azure AD. This can be done manually, as a standalone act, or, better yet, as part of an Azure Automation account with a Run As account. The latter will deploy a new service principal in Azure Active Directory (AD) for us, along with a certificate, and assign it the Contributor role-based access control so that ARM can use it in runbooks.

To recap, we will need:

An Azure Automation account so that we can have:

· Runbook(s) – for this exercise, specifically a PowerShell runbook;

· A Run As account so that the script can authenticate against Azure AD;

· Schedules to run the runbook.

This is how you can achieve it:


(If you already have an automation account but don’t have a Run As account, create an Application Account in Azure AD.)

Having created the Azure Automation account, we can peek at the new Run As account with the service principal already created for us:


Additionally, we can take the opportunity to gather the application, tenant and subscription IDs; they will serve us later.

With the Automation account in place, it is time to create a key for the app account: go to your app account in Azure AD, and in the All settings menu select Keys and create a new key with the desired duration. Copy the key value and save it somewhere safe – you won’t be able to get it later!


For now, all we have to do is to collect:

· ApplicationID: in Azure AD –> App Registrations –> the name of the app we just created

· Application Key: Collected from the previous steps

· TenantID: Azure Active Directory –> Properties –> Directory ID value

· SubscriptionID: From the Azure URL:…

· Resource group name: From the Azure URL:…/resourceGroups/xxxResourceGroup/…

· SSAS server name: Analysis Services -> YourServerName

With those in hand, replace the corresponding values in the script below and save it somewhere for now – we encourage you to develop and test your PowerShell scripts in the PowerShell ISE – and yes, this script will also work on an on-premises machine.

#region parameters
param (
            [Parameter(Mandatory = $true)]
            [System.String]$action = 'suspend',

            [Parameter(Mandatory = $true)]
            [System.String]$resourceGroupName = 'YourResourceGroup',

            [Parameter(Mandatory = $true)]
            [System.String]$serverName = 'YourAsServerName'
)
#endregion

#region variables 
    $ClientID       = 'YourApplicationId'
    $ClientSecret   = 'YourApplicationKey'
    $TenantId       = 'YourTenantId' 
    $SubscriptionId = 'YourSubscriptionId'
#endregion

#region Get Access Token
    $TokenEndpoint = 'https://login.windows.net/{0}/oauth2/token' -f $TenantId 
    $ARMResource = 'https://management.core.windows.net/'

    $Body = @{
            'resource'      = $ARMResource
            'client_id'     = $ClientID
            'grant_type'    = 'client_credentials'
            'client_secret' = $ClientSecret
    }

    $params = @{
        ContentType = 'application/x-www-form-urlencoded'
        Headers = @{'accept'='application/json'}
        Body = $Body
        Method = 'Post'
        URI = $TokenEndpoint
    }

    $token = Invoke-RestMethod @params
#endregion

#region Suspend/Resume AS -> depending on the action parameter
    #POST /subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.AnalysisServices/servers/{serverName}/resume?api-version=2016-05-16

    #POST /subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.AnalysisServices/servers/{serverName}/suspend?api-version=2016-05-16

    $requestUri = "https://management.azure.com/subscriptions/$SubscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.AnalysisServices/servers/$serverName/${action}?api-version=2016-05-16"

    $params = @{
        ContentType = 'application/x-www-form-urlencoded'
        Headers = @{'authorization'="Bearer $($token.access_token)"}
        Method = 'Post'
        URI = $requestUri
    }

    Invoke-RestMethod @params
#endregion
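One subtlety worth calling out when building the request URI in the script above: because `?` is a legal character in PowerShell variable names, the action segment must be brace-delimited with `${}` before `?api-version`, or PowerShell will look up a variable literally named `action?api-version`. A quick standalone sketch (all values below are placeholders, not real IDs):

```powershell
$subscriptionId    = '00000000-0000-0000-0000-000000000000'  # placeholder
$resourceGroupName = 'MyResourceGroup'                       # placeholder
$serverName        = 'myasserver'                            # placeholder
$action            = 'suspend'

# ${action} must be brace-delimited: without the braces, "$action?api-version"
# would be parsed as a lookup of the variable 'action?api-version'
$requestUri = "https://management.azure.com/subscriptions/$subscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.AnalysisServices/servers/$serverName/${action}?api-version=2016-05-16"

$requestUri
```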


With the PowerShell script assembled – note that one of the script parameters is the action (suspend/resume) that we want the script to execute against the AS server – the next steps are:

· Create a runbook of type PowerShell within the automation account created earlier, paste the previous script, save it and…voilà, ready to test, publish and automate!


· The next step is to publish the runbook so that it can be used in a schedule, fully automating the suspend/resume procedure. After publishing the runbook, create/assign schedules for it – one to suspend and another to resume the AS server:


Afterwards, configure the desired script parameters for each schedule:


The final result should look like this and give us the desired suspend/resume Azure AS automation.


We hope that you’ve learned something from our post – have fun with Azure automation and leave your comments below!

Filipe Sousa

Use Power BI Desktop as an ETL Tool

Have you ever faced a scenario where you needed to load a collection of CSV/text files into SQL Server tables?

What solution did you choose?

  • SSIS Package (generated from SSMS Tasks->Import Data or manual)
  • PowerShell “Import-CSV”

And what if the SQL Server destination tables must be typed (numeric, date, text columns,…) and the CSV file has formatting issues (e.g. text columns without quotes, datetimes not in ISO format), so that you need to transform the columns into the desired types?
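To illustrate the typing problem, this is roughly what coercing such columns looks like by hand in PowerShell (the raw values are hypothetical examples of the formatting issues mentioned above):

```powershell
# Hypothetical raw CSV values with formatting issues
$rawDate   = "31/12/2017 23:59"   # datetime not in ISO format
$rawAmount = "1234.50"            # numeric value stored as text

# Parse the datetime with an explicit format and the invariant culture
$typedDate = [datetime]::ParseExact($rawDate, "dd/MM/yyyy HH:mm", [System.Globalization.CultureInfo]::InvariantCulture)

# Cast the text column to a double
$typedAmount = [double]$rawAmount

"$($typedDate.ToString('s')) / $typedAmount"
```

Doing this for every column of every file is exactly the tedium the approach below avoids.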

A much quicker way to transform CSV files into the desired shape is a Power BI Desktop query (or Power Query). For example, in seconds I can:

  • Load the CSV
  • Replace a value in all the columns (in this case the text “NULL” with a real null)
  • Auto detect the datatypes


Now, loading these queries into a SQL Server database is very easy thanks to the DevScope PowerShell module “PowerBIETL” (also available in the PowerShell Gallery):

Install-Module PowerBIETL
Import-Module PowerBIETL

Export-PBIDesktopToSQL -pbiDesktopWindowName "*sample*" -sqlConnStr "Data Source=.\SQL2014; Initial Catalog=DestinationDB; Integrated Security=SSPI" -sqlSchema "stg" -verbose

The cmdlet “Export-PBIDesktopToSQL” takes care of:

  1. Connecting to the PBI Desktop file and reading the tables
  2. Automatically creating the tables in the SQL database (if they do not exist)
    • Thanks to the DevScope “SQLHelper” powershell module and its “Invoke-SQLBulkCopy” cmdlet
  3. Bulk copying the data from PBI Desktop into the SQL tables

The cmdlet has 4 parameters:

  • -PBIDesktopWindowName (mandatory)
    • A wildcard to find the PowerBI Desktop window
  • -Tables (optional, defaults to all the tables)
    • Array of tables to import
  • -SQLConnStr (mandatory)
    • Connection to a SQL Server database
  • -SQLSchema (optional, defaults to “dbo”)
    • The schema under which the tables will be created

As a result, all the tables from the PBI Desktop file get copied into the SQL Server database:


Of course this will only work for those “one-time-only” or manual scenarios, but I assure you that it is much quicker than using a SQL Server Integration Services package 😉

Twitter Real-Time Analytics in PowerBI with #PowerBIPS

At SQLSaturday Paris I made a demo of real-time Twitter analytics in PowerBI using the PowerShell module PowerBIPS.

In this post I will show how I did that (you can find the full script at the bottom). The script does the following:

  1. Imports the required modules: #PowerBIPS and #InvokeTwitterAPIs
  2. Saves the Twitter OAuth settings in a variable (learn more here)
  3. Starts an infinite loop that polls the Twitter data and at each iteration:
    1. Calls the Twitter search API, searching for new tweets containing the #PowerBI hashtag since the last query
      1. The script stores the “sinceId” in a local file and uses it on the next execution
    2. Parses the Twitter search results and for each tweet does some sentiment analysis using a REST API
    3. Sends that data to PowerBI using the Out-PowerBI cmdlet of #PowerBIPS

At the end you should be able to build a dashboard similar to this:



$ErrorActionPreference = "Stop"

$currentPath = (Split-Path $MyInvocation.MyCommand.Definition -Parent)

# Import the PowerBIPS Powershell Module:
Import-Module "$currentPath\..\PowerBIPS" -Force

# Import the InvokeTwitterAPIs module:
Import-Module "$currentPath\Modules\InvokeTwitterAPIs.psm1" -Force

#region Twitter Settings

# Learn how to generate these keys at the Twitter application management portal
$accessToken = "your access token key"
$accessTokenSecret = "your access token secret"
$apiKey = "your api key"
$apiSecret = "your api secret"

$twitterOAuth = @{'ApiKey' = $apiKey; 'ApiSecret' = $apiSecret; 'AccessToken' = $accessToken; 'AccessTokenSecret' = $accessTokenSecret}

#endregion

$sinceId = $null
$sinceIdFilePath = "$currentPath\twitterDemoSinceId.txt"

while($true)
{
	if (Test-Path $sinceIdFilePath)
	{
		$sinceId = Get-Content $sinceIdFilePath
	}

	# Hashtags to search (separated by comma) and the number of tweets to return

	$twitterAPIParams = @{'q'='#powerbi'; 'count' = '5'}

	if (-not [string]::IsNullOrEmpty($sinceId))
	{
		$twitterAPIParams.Add("since_id", $sinceId)
	}

	# Get Twitter Data (if SinceId is not null it will get tweets since that one)

	$result = Invoke-TwitterRestMethod -ResourceURL 'https://api.twitter.com/1.1/search/tweets.json' -RestVerb 'GET' -Parameters $twitterAPIParams -OAuthSettings $twitterOAuth -Verbose

	# Parse the Twitter API data

	$twitterData = $result.statuses |? { [string]::IsNullOrEmpty($sinceId) -or $sinceId -ne $_.id_str } |% {
		$aux = @{
			Id = $_.id_str
			; UserId = $_.user.id
			; UserName = $_.user.name
			; UserScreenName = $_.user.screen_name
			; UserLocation = $_.user.location
			; Text = $_.text
			; CreatedAt = [System.DateTime]::ParseExact($_.created_at, "ddd MMM dd HH:mm:ss zzz yyyy", [System.Globalization.CultureInfo]::InvariantCulture)
		}

		# Get the Sentiment Score (polarity: 0 = negative, 4 = positive)

		$textEncoded = [System.Web.HttpUtility]::UrlEncode($aux.Text, [System.Text.Encoding]::UTF8)

		$sentimentResult = Invoke-RestMethod -Uri "http://www.sentiment140.com/api/classify?text=$textEncoded" -Method Get -Verbose

		switch ($sentimentResult.results.polarity)
		{
			"0" { $aux.Add("Sentiment", "Negative") }
			"4" { $aux.Add("Sentiment", "Positive") }
			default { $aux.Add("Sentiment", "Neutral") }
		}

		Write-Output $aux
	}

	if ($twitterData -and $twitterData.Count -ne 0)
	{
		# Persist the SinceId

		$sinceId = ($twitterData | Sort-Object "CreatedAt" -Descending | Select -First 1).Id
		Set-Content -Path $sinceIdFilePath -Value $sinceId

		# Send the data to PowerBI

		$twitterData | Out-PowerBI -dataSetName "TwitterPBIAnalysis" -tableName "Tweets" -types @{"Tweets.CreatedAt"="datetime"} -verbose
	}
	else
	{
		Write-Output "No tweets found."
	}

	Write-Output "Sleeping..."

	Sleep -Seconds 30
}
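One detail worth isolating from the script above is parsing Twitter’s `created_at` timestamp, which is not in ISO format and needs an explicit format string with the invariant culture (the sample tweet date below is made up):

```powershell
# Twitter's created_at format, e.g. "Mon Sep 07 15:30:00 +0000 2015" (hypothetical tweet date)
$createdAt = "Mon Sep 07 15:30:00 +0000 2015"

# ParseExact with an explicit format string and the invariant culture,
# exactly as done for the CreatedAt column in the script
$parsed = [System.DateTime]::ParseExact($createdAt, "ddd MMM dd HH:mm:ss zzz yyyy", [System.Globalization.CultureInfo]::InvariantCulture)

# The zzz offset is applied during parsing, so convert back to UTC for display
$parsed.ToUniversalTime().ToString("yyyy-MM-dd HH:mm")
```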

Using #PowerBIPS to Upload CSV Files to PowerBI

In June I made a guest post on the PowerBI Developer Blog about #PowerBIPS that showed how to upload a collection of local CSV files into PowerBI, check it out here:

PS – Thanks Josh 😉



$ErrorActionPreference = "Stop"

$currentPath = (Split-Path $MyInvocation.MyCommand.Definition -Parent)

# Create Archive Folder

New-Item -Name "Archive" -Force -ItemType directory -Path "$currentPath\CSVData" | Out-Null

Import-Module "$currentPath\Modules\PowerBIPS" -Force

while($true)
{
	# Iterate each CSV file and send to PowerBI

	Get-ChildItem "$currentPath\CSVData" -Filter "*.csv" |% { 

		$file = $_

		# Import csv and add column with filename

		$data = Import-Csv $file.FullName | select @{Label="File";Expression={$file.Name}}, *

		# Send data to PowerBI

		$data | Out-PowerBI -dataSetName "CSVSales" -tableName "Sales" -types @{"Sales.OrderDate"="datetime"; "Sales.SalesAmount"="double"; "Sales.Freight"="double"} -batchSize 300 -verbose

		# Archive the file

		Move-Item $file.FullName "$currentPath\CSVData\Archive\" -Force
	}

	Write-Output "Sleeping..."

	Sleep -Seconds 5
}
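The Import-Csv projection above, which stamps each row with its source file name before upload, can be tried in isolation (a temp-file sketch; the file name and columns are illustrative):

```powershell
# Write a tiny sample CSV to the temp folder
$tmp = Join-Path ([System.IO.Path]::GetTempPath()) "sample.csv"
"OrderDate,SalesAmount`n2015-01-01,100.5" | Set-Content $tmp

# Same pattern as the script: capture the file object, then prepend a computed "File" column
$file = Get-Item $tmp
$data = Import-Csv $file.FullName | select @{Label="File";Expression={$file.Name}}, *

$data[0].File          # the source file name travels with every row
$data[0].SalesAmount

Remove-Item $tmp
```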


Create a Real-Time IT Dashboard with PowerBIPS

Last week we published on GitHub a PowerShell module for the new PowerBI developer REST APIs named “PowerBIPS”, download it here:

In this post I will demonstrate the ease of use of this module and show you, step by step, how to build a PowerShell script that uploads server stats to PowerBI, allowing users to build a real-time IT dashboard like this:


The script is divided into the following steps (full script):

  1. Import the PowerBIPS module
  2. Get the Authentication Token
  3. Create a PowerBI DataSet
  4. Upload data to PowerBI

Import the PowerBIPS module

The script starts by importing the PowerShell module:

# Module needs to be installed in %USERPROFILE%\WindowsPowerShell\Modules\
Import-Module PowerBIPS -Force

Or by pointing directly to the .psm1 file:

Import-Module <path to module>\PowerBIPS.psm1 -Force

After this, all the PowerBIPS cmdlets are available for use in the PowerShell console.

Get the Authentication Token

Next we need to get the authentication token that is needed to communicate with the API; this token identifies your PowerBI tenant and grants access to it:

$authToken = Get-PBIAuthToken -clientId "<your client id>"

This cmdlet requires you to provide the Client ID of the Native Application that you need to create in the Windows Azure Active Directory of your tenant:

PS – Follow this guide to create an Azure client app (thanks to mattmcnabb)


When you execute this cmdlet, the following popup is shown, asking for the credentials to connect to the PowerBI account:


It’s also possible to set the username and password in the cmdlet, which is the ideal mode for automation jobs.

Create a PowerBI DataSet

Before creating the DataSet we will test whether a DataSet named “IT Server Monitor” already exists in PowerBI and, if not, create a new one:

$dataSetMetadata = Get-PBIDataSet -authToken $authToken -dataSetName "IT Server Monitor"

if (-not $dataSetMetadata)
{
  # If cannot find the DataSet create a new one with this schema	
  $dataSetSchema = @{
	name = "IT Server Monitor"
	; tables = @(
		@{name = "Processes"
		; columns = @( 
			@{ name = "ComputerName"; dataType = "String"  }					
			, @{ name = "Date"; dataType = "DateTime"  }
			, @{ name = "Hour"; dataType = "String"  }
			, @{ name = "Id"; dataType = "String"  }
			, @{ name = "ProcessName"; dataType = "String"  }
			, @{ name = "CPU"; dataType = "Double"  }
			, @{ name = "Memory"; dataType = "Double"  }
			, @{ name = "Threads"; dataType = "Int64"  }					
			)
		})
  }

  $dataSetMetadata = New-PBIDataSet -authToken $authToken -dataSet $dataSetSchema -Verbose
}

The $dataSetSchema variable is a hierarchical hashtable that defines the schema of the dataset (a System.Data.DataSet is also supported).

In case of success, the “New-PBIDataSet” cmdlet returns the PowerBI dataset metadata with the internal id of the dataset.
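Since a System.Data.DataSet is also supported, the same schema could be built that way instead of as a hashtable – an illustrative alternative, not taken from the original post:

```powershell
# Build the equivalent schema as a System.Data.DataSet
# (New-PBIDataSet also accepts this shape, per the note above)
$dataSet = New-Object System.Data.DataSet "IT Server Monitor"
$table = $dataSet.Tables.Add("Processes")

[void]$table.Columns.Add("ComputerName", [string])
[void]$table.Columns.Add("Date",         [datetime])
[void]$table.Columns.Add("Hour",         [string])
[void]$table.Columns.Add("Id",           [string])
[void]$table.Columns.Add("ProcessName",  [string])
[void]$table.Columns.Add("CPU",          [double])
[void]$table.Columns.Add("Memory",       [double])
[void]$table.Columns.Add("Threads",      [int64])
```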

Upload data to PowerBI

In this script we will start by deleting all the data in the “Processes” table (optional):

Clear-PBITableRows -authToken $authToken -dataSetId $dataSetMetadata.Id -tableName "Processes"

Then, every 10 seconds, it cycles through the servers and for each one:

  1. Gets all the running processes and related stats: CPU, memory,…
  2. Uploads the collected data to PowerBI
$computers = @("DSRomano")

while($true)
{
	$computers |% {
		$computerName = $_
		$timeStamp = [datetime]::Now
		# Dates sent as string because of an issue with ConvertTo-Json (alternative is to convert each object to a hashtable)
		Get-Process -ComputerName $computerName | Select @{Name = "ComputerName"; Expression = {$computerName}} `
			, @{Name="Date"; Expression = {$timeStamp.Date.ToString("yyyy-MM-dd")}} `
			, @{Name="Hour"; Expression = {$timeStamp.ToString("HH:mm:ss")}} `
			, Id, ProcessName, CPU, @{Name='Memory';Expression={($_.WorkingSet/1MB)}}, @{Name='Threads';Expression={($_.Threads.Count)}} `
		| Add-PBITableRows -authToken $authToken -dataSetId $dataSetMetadata.Id -tableName "Processes" -batchSize -1 -Verbose
	}

	sleep -Seconds 10
}
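The Memory column above converts WorkingSet, which Get-Process reports in bytes, to megabytes using PowerShell’s built-in 1MB constant. A quick check with a made-up value:

```powershell
# WorkingSet is reported in bytes; dividing by the 1MB constant (1048576) yields megabytes
$workingSetBytes = 157286400   # hypothetical working set of a process
$memoryMB = $workingSetBytes / 1MB
$memoryMB
```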

After you run the script, the following DataSet should appear in your PowerBI account:


And you can start searching, pinning, slicing & dicing this real-time data:




The full PowerShell script can be downloaded here.

A quick ETL PowerShell script to copy data between databases

I’ll start this post by announcing that we just moved our SQLHelper PowerShell module from CodePlex to GitHub; you can grab it here:

We also made a new sample to demonstrate the simplicity of use of this module. The objective is to build a very simple ETL script that performs these steps:

  1. Query a collection of tables from a source database (can be any type: SQL/Oracle/…)
  2. Insert into a SQLServer database using bulk loading
    • If the table does not exist in the destination, it must be created automatically

The PowerShell code:

Import-Module ".\SQLHelper.psm1" -Force

$sourceConnStr = "sourceconnstr"

$destinationConnStr = "destinationconnstr"

$tables = @("[dbo].[DimProduct]", "[dbo].[FactInternetSales]")

$steps = $tables.Count
$i = 1;

$tables |% {

    $sourceTableName = $_
    $destinationTableName = $sourceTableName

    Write-Progress -activity "Tables Copy" -CurrentOperation "Executing source query over '$sourceTableName'" -PercentComplete (($i / $steps) * 100) -Verbose

    $sourceTable = (Invoke-DBCommand -connectionString $sourceConnStr -commandText "select * from $sourceTableName").Tables[0]

    Write-Progress -activity "Tables Copy" -CurrentOperation "Creating destination table '$destinationTableName'" -PercentComplete (($i / $steps) * 100) -Verbose

    Invoke-SQLCreateTable -connectionString $destinationConnStr -table $sourceTable -tableName $destinationTableName -force

    Write-Progress -activity "Tables Copy" -CurrentOperation "Loading destination table '$destinationTableName'" -PercentComplete (($i / $steps) * 100) -Verbose

    Invoke-SQLBulkCopy -connectionString $destinationConnStr -data $sourceTable -tableName $destinationTableName

    $i++
}

Write-Progress -activity "Tables Copy" -Completed


Configure Reporting Services SSL Binding with WMI & PowerShell

Recently I’ve been building some scripts in PowerShell to automate the deployment of business intelligence bits like SSAS, SSRS, SSIS, databases,…

And one of the tasks was to configure SSRS (Reporting Services) with an SSL binding. The problem is that the only way to do that is with the SSRS WMI provider.

With PowerShell we have the cmdlet “get-wmiobject”, which makes things a lot easier: this cmdlet builds a “proxy” to a WMI class and lets us call the methods of that class.

All we need to do is obtain an instance of the WMI class “MSReportServer_ConfigurationSetting”:

$serverClass = get-wmiobject -namespace "root\Microsoft\SqlServer\ReportServer\RS_sql2008\v10\Admin" -class "MSReportServer_ConfigurationSetting"

Then reserve the URL by calling the “ReserveURL” method:

$result = $serverClass.ReserveURL("ReportServerWebService", "https://mysite:443", 2070)

This method receives the following parameters:

  • Name of the SSRS web application: ReportServerWebService or ReportManager
  • The URL to be reserved in http.sys
  • The locale id (LCID) of the returned messages; in this case I’m using the Portugal LCID

Then I need to call the “CreateSSLCertificateBinding” method:

$result = $serverClass.CreateSSLCertificateBinding("ReportServerWebService", "e9b993f5a5101bf9bea71896ffc07118b9ca2dcc", "", 443, 2070)

This method receives the following parameters:

  • Name of the SSRS web application
  • The certificate hash or thumbprint
  • The IP address of the web application; in this case I use the wildcard IP address
  • The SSL port
  • The LCID

The syntax of the members & methods of the WMI class “MSReportServer_ConfigurationSetting” can be found here:

To wrap it up, here is a complete PowerShell script solution to this problem:

function Config-SSRSSystemConfiguration($sslUrl, $certHash, $sslPort)
{
    # The .ToLower() avoids the error "A Secure Sockets Layer (SSL) certificate is not configured on the Web site." (Thanks Michel)
    $certHash = $certHash.ToLower()

    Write-Output "Configure SSRS SSL binding"

    $serverClass = get-wmiobject -namespace "root\Microsoft\SqlServer\ReportServer\RS_sql2008\v10\Admin" -class "MSReportServer_ConfigurationSetting"

    if ($serverClass -eq $null) { throw "Cannot find wmi class" }

    $lcid = [System.Globalization.CultureInfo]::GetCultureInfo("pt-PT").LCID

    $result = $serverClass.RemoveURL("ReportServerWebService", $sslUrl, $lcid)
    if (!($result.HRESULT -eq 0)) { write-error $result.Error }
    $result = $serverClass.ReserveURL("ReportServerWebService", $sslUrl, $lcid)
    if (!($result.HRESULT -eq 0)) { write-error $result.Error }
    $result = $serverClass.RemoveSSLCertificateBindings("ReportServerWebService", $certHash, "", $sslPort, $lcid)
    if (!($result.HRESULT -eq 0)) { write-error $result.Error }
    $result = $serverClass.CreateSSLCertificateBinding("ReportServerWebService", $certHash, "", $sslPort, $lcid)
    if (!($result.HRESULT -eq 0)) { write-error $result.Error }
    $result = $serverClass.RemoveURL("ReportManager", $sslUrl, $lcid)
    if (!($result.HRESULT -eq 0)) { write-error $result.Error }
    $result = $serverClass.ReserveURL("ReportManager", $sslUrl, $lcid)
    if (!($result.HRESULT -eq 0)) { write-error $result.Error }
    $result = $serverClass.RemoveSSLCertificateBindings("ReportManager", $certHash, "", $sslPort, $lcid)
    if (!($result.HRESULT -eq 0)) { write-error $result.Error }
    $result = $serverClass.CreateSSLCertificateBinding("ReportManager", $certHash, "", $sslPort, $lcid)
    if (!($result.HRESULT -eq 0)) { write-error $result.Error }
}

# Create SSL Certificate

$certificatesFolder = "c:\Certificates"
$cn = "mysite"

# the EKU OID below is the standard server-authentication EKU
& makecert -r -pe -n CN="$cn" -eku -ss my -sr localmachine `
    -sky exchange -sp "Microsoft RSA SChannel Cryptographic Provider" -sy 12 "$certificatesFolder\SSLCert.cer"

$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("$certificatesFolder\SSLCert.cer")

$certHash = $cert.Thumbprint

# Configure the SSL binding
Config-SSRSSystemConfiguration "https://$($cn):443" $certHash 443

In this script I create the certificate with the “makecert” command and then load it to obtain the certificate hash.

In the PowerShell function “Config-SSRSSystemConfiguration”, before calling the methods I described earlier, I call the respective remove method, whose signature is similar.

Note: this was tested on SSRS 2008; I do not know if it works on SSRS 2005.

Hope it helps.