Export Power BI Desktop data to CSV Files

A few months ago I published a PowerShell module called PowerBIETL that allows you to export all your Power BI Desktop file data to a SQL Server database:

https://ruiromanoblog.wordpress.com/2016/04/21/use-power-bi-desktop-as-an-etl-tool/

I've just updated the module to also support exporting to CSV files:


Install-Module PowerBIETL

Import-Module PowerBIETL

Export-PBIDesktopToCSV -pbiDesktopWindowName "*PBI Window Name*" -outputPath ".\Output"

After running the PowerShell code above, you will end up with a CSV file for every table in your Power BI Desktop file.


Thanks to Rui Quintino for the draft code and to Josh Close for creating the amazing CsvHelper!


Automatically Export PBIX’s using #PowerBIPS

Ever wanted to export all your Power BI reports from a workspace and save a copy, without manually going to each one and selecting the “Download report” option?


My team at DevScope just updated the PowerBIPS PowerShell module with a simple cmdlet to do just that:

Export-PBIReport -destinationFolder "<local folder>"

With a simple PowerShell script you can download all the reports from your Power BI workspaces (for example, for backups):



Install-Module PowerBIPS

# Get the Auth Token

$authToken = Get-PBIAuthToken 

# Define the Workspace you want to download the reports from - Optional, by default downloads from personal workspace

Set-PBIGroup -authToken $authToken -name "Demos - PBIFromTrenches" -Verbose

# Downloads the reports to a destination folder

Export-PBIReport -authToken $authToken -destinationFolder "C:\Temp\PBIReports" -Verbose

Automatically pause/resume and scale up/down Azure Analysis Services using AzureRM.AnalysisServices

One of the big advantages of Azure Analysis Services is the ability to pause/resume and scale up/down as needed. This allows you to pay only for what you use and greatly reduce costs.

The Azure Analysis Services team released a PowerShell module, “AzureRM.AnalysisServices”, with cmdlets to manage your Azure Analysis Services resources, and they couldn't be easier to use:

  • Get-AzureRmAnalysisServicesServer – To get your server metadata and current status
  • Suspend-AzureRmAnalysisServicesServer – To suspend a server
  • Resume-AzureRmAnalysisServicesServer – To resume a server
  • Set-AzureRmAnalysisServicesServer – To update a server (e.g. change the SKU)

More details here.
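
To see how these cmdlets fit together, here is a minimal sketch (the resource group and server names are placeholders, and it assumes you have already signed in with Login-AzureRmAccount):

# Requires the AzureRM.AnalysisServices module (Install-Module AzureRM.AnalysisServices)
$server = Get-AzureRmAnalysisServicesServer -ResourceGroupName "MyResourceGroup" -Name "myasserver"
$server.State    # reports whether the server is currently running or paused

# Pause the server when it is not needed
Suspend-AzureRmAnalysisServicesServer -ResourceGroupName "MyResourceGroup" -Name "myasserver"

# Resume it and scale it up to S1 for peak hours
Resume-AzureRmAnalysisServicesServer -ResourceGroupName "MyResourceGroup" -Name "myasserver"
Set-AzureRmAnalysisServicesServer -ResourceGroupName "MyResourceGroup" -Name "myasserver" -Sku "S1"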

But this is only effective if we somehow automate the operation; it's not feasible to have someone on the team or the customer manually pausing/resuming or scaling the instance up/down.

With that in mind, we built a very simple PowerShell script where you configure at which times and on which days the Azure AS server should be on, and at which SKU.

Download the full script here.

The script is configured by a JSON metadata string.


In this example, the metadata configures Azure AS as follows (a sketch of a possible config string is shown after the table):

Days    | Hours                       | SKU
Mon-Fri | 8 AM to 6 PM (peak hours)   | S1
Mon-Fri | 6 PM to midnight (off-peak) | S0
Sat-Sun | 8 AM to midnight            | S0
Other   | Other                       | Paused
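
The exact schema is defined by the downloadable script, but to illustrate the idea, a config string covering the table above could look something like this (a hypothetical shape, not necessarily the one the script expects):

$configStr = @'
[
    { "Sku": "S1", "Days": "Mon-Fri", "From": "08:00", "To": "18:00" },
    { "Sku": "S0", "Days": "Mon-Fri", "From": "18:00", "To": "00:00" },
    { "Sku": "S0", "Days": "Sat-Sun", "From": "08:00", "To": "00:00" }
]
'@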

The PowerShell script has the following parameters:

  • -resourceGroupName – The name of the Azure resource group where your Azure AS server is deployed
  • -serverName – The name of the Azure AS server
  • -configStr – The JSON metadata config string
  • -azureProfilePath (optional) – The path to an Azure profile stored locally using the “Save-AzureRmContext” cmdlet; useful to test the script locally
  • -azureRunAsConnectionName (optional) – The name of the Azure connection to use if you run this script in an Azure Automation runbook
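
For a quick local test, you can persist your Azure context once and then call the script; the script file name below is just a placeholder:

# Sign in once and save the context so the script can reuse it (see Save-AzureRmContext above)
Login-AzureRmAccount
Save-AzureRmContext -Path ".\azureProfile.json"

# Run the script against a placeholder resource group and server
.\ScaleAzureAS.ps1 -resourceGroupName "MyResourceGroup" -serverName "myasserver" `
    -configStr $configStr -azureProfilePath ".\azureProfile.json"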

Most of you will probably want to run this script in a PowerShell runbook in Azure Automation; learn how to set up Azure Automation here.

Next you will need to register the module “AzureRM.AnalysisServices” as an asset of your Automation account.


After that, just create a new PowerShell runbook.


Paste the script and remember to set the parameter -azureRunAsConnectionName.


Publish the script and create a schedule.
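
If you prefer to script this part instead of using the portal, something along these lines should work (the account, runbook and schedule names are placeholders; a single hourly schedule is just one possible setup, since the script reads the config and applies the appropriate state):

# Requires the AzureRM.Automation module and an authenticated session
New-AzureRmAutomationSchedule -ResourceGroupName "MyResourceGroup" -AutomationAccountName "MyAutomationAccount" `
    -Name "AzureAS-Hourly" -StartTime (Get-Date).AddHours(1) -HourInterval 1

# Attach the schedule to the runbook and pass the script parameters
Register-AzureRmAutomationScheduledRunbook -ResourceGroupName "MyResourceGroup" -AutomationAccountName "MyAutomationAccount" `
    -RunbookName "ScaleAzureAS" -ScheduleName "AzureAS-Hourly" `
    -Parameters @{ resourceGroupName = "MyResourceGroup"; serverName = "myasserver"; configStr = $configStr; azureRunAsConnectionName = "AzureRunAsConnection" }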


That’s it! You now have your Azure AS automatically pausing/resuming and scaling up/down using the configuration you defined.

Now just lay back and measure the savings at the end of the month!

Automate Azure Analysis Services Pause/Resume using PowerShell

This is a guest post from my colleague Filipe Sousa from DevScope, who will share his very recent findings on automating management tasks in Azure Analysis Services. Take it away, Filipe:

Recently we came across the need to use one of the newest Azure services – Azure Analysis Services (AS). This led us to an awesome Software as a Service (SaaS): dazzling query speed, stunning scalability… and a new administration paradigm, administering SaaS in the cloud.

Since Azure Analysis Services is charged hourly and we know that we will not use the service 24/7, how could we automate the pause/resume feature of the Azure Analysis Service so that we could optimize savings?

It couldn’t be more straightforward, except for some lack of documentation/examples. Thanks to Josh Caplan for pointing us in the right direction: the Azure Analysis Services REST API.

First, so that the REST calls to the ARM API can be authenticated, we need to create an app account in Azure AD. This can be done manually, as a standalone step, or, better yet, as part of an Azure Automation account with a Run As account creation. The latter deploys a new service principal in Azure Active Directory (AD) for us, along with a certificate, and assigns it the Contributor role (role-based access control) so that ARM can use it in further runbooks.
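
For reference, the manual route looks roughly like this (the display name and identifier URI are placeholders; the Run As account creation does the equivalent for you, including the certificate):

# Requires the AzureRM.Resources module and an authenticated session
$app = New-AzureRmADApplication -DisplayName "AzureAsAutomation" -IdentifierUris "http://AzureAsAutomation"
$sp  = New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId

# Give the service principal Contributor rights so the ARM calls can manage the server
# (allow a few seconds for the service principal to propagate before assigning the role)
New-AzureRmRoleAssignment -RoleDefinitionName "Contributor" -ServicePrincipalName $app.ApplicationId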

To recap, we will need an Azure Automation account so that we can have:

· Runbook(s) – for this exercise, specifically a PowerShell runbook;

· A Run As account, so that the script can authenticate against Azure AD;

· Schedules to run the runbook.

This is how you can achieve it in the Azure portal.


(If you already have an Automation account but don't have a Run As account, create an application account in Azure AD.)

Having created the Azure Automation account, we can peek at the new Run As account, with the service principal already created for us.


Additionally, we can take the opportunity to gather the application, tenant and subscription IDs; they will serve us later.

With the Automation account in place, it is time to create a key for it: go to your app account in Azure AD, select Keys under the All settings menu, and create a new key with the desired duration. Copy the key value and save it somewhere safe; you won’t be able to get it later!


For now, all we have to do is to collect:

· ApplicationID: Azure AD -> App Registrations -> the name of the app we just created

· Application Key: collected in the previous step

· TenantID: Azure Active Directory -> Properties -> Directory ID value

· SubscriptionID: from the Azure URL: https://portal.azure.com/#resource/subscriptions/e583134e-xxxx-xxxx-xxxx-bb8c398a8116/…

· Resource group name: from the Azure URL: https://portal.azure.com/…/resourceGroups/xxxResourceGroup/…

· SSAS server name: Analysis Services -> YourServerName

With those in hand, replace the values in the script below and save it somewhere for now (we encourage you to develop and test your PowerShell scripts in an IDE such as the PowerShell ISE). And yes, this script will also work on an on-premises machine.



#region parameters
    param(
            [Parameter(Mandatory = $true)]
            [ValidateSet('suspend','resume')]
            [System.String]$action = 'suspend',

            [Parameter(Mandatory = $true)]
            [System.String]$resourceGroupName = 'YouResourceGroup',

            [Parameter(Mandatory = $true)]
            [System.String]$serverName = 'YourAsServerName'
    )
#endregion


#region variables 
    $ClientID       = 'YourApplicationId'
    $ClientSecret   = 'YourApplicationKey'
    $tenantId       = 'YourTenantId'
    $SubscriptionId = 'YourSubscriptionId'
#endregion
 

#region Get Access Token
    $TokenEndpoint = "https://login.windows.net/{0}/oauth2/token" -f $tenantId
    $ARMResource = "https://management.core.windows.net/"

    $Body = @{
            'resource'= $ARMResource
            'client_id' = $ClientID
            'grant_type' = 'client_credentials'
            'client_secret' = $ClientSecret
    }

    $params = @{
        ContentType = 'application/x-www-form-urlencoded'
        Headers = @{'accept'='application/json'}
        Body = $Body
        Method = 'Post'
        URI = $TokenEndpoint
    }

    $token = Invoke-RestMethod @params
#endregion


#region Suspend/Resume AS -> depending on the action parameter
    
    #URI TO RESUME
    #POST /subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.AnalysisServices/servers/{serverName}/resume?api-version=2016-05-16

    #URI TO SUSPEND
    #POST /subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.AnalysisServices/servers/{serverName}/suspend?api-version=2016-05-16

    $requestUri = "https://management.azure.com/subscriptions/$SubscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.AnalysisServices/servers/$serverName/$action ?api-version=2016-05-16"

    $params = @{
        ContentType = 'application/x-www-form-urlencoded'
        Headers = @{
        'authorization'="Bearer $($Token.access_token)"
        }
        Method = 'Post'
        URI = $requestUri
    }

    Invoke-RestMethod @params

#endregion
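
To test it locally before wrapping it in a runbook, a call like the following should be enough (the script file name and resource names are placeholders):

.\SuspendResumeAzureAS.ps1 -action suspend -resourceGroupName "MyResourceGroup" -serverName "myasserver"
.\SuspendResumeAzureAS.ps1 -action resume  -resourceGroupName "MyResourceGroup" -serverName "myasserver"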

With the PowerShell script assembled (note that one of the script parameters is the action, suspend or resume, that we want to execute against the Azure AS server), the next steps are:

· Create a runbook of type PowerShell within the previously created Automation account, paste the previous script, save it and… voilà, ready to test, publish and automate!


· The next step is to publish the runbook so that it can be used in a schedule, fully automating the suspend/resume procedure. After publishing the runbook, create/assign its schedules: one to suspend and another to resume the AS server.


Afterwards, configure the desired script parameters for each schedule.

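If you would rather do this from PowerShell than from the portal, the scheduled-runbook registration accepts the parameters directly (all names below are placeholders, and the two schedules are assumed to exist already):

# Requires the AzureRM.Automation module; one schedule resumes the server in the morning, another suspends it in the evening
Register-AzureRmAutomationScheduledRunbook -ResourceGroupName "MyResourceGroup" -AutomationAccountName "MyAutomationAccount" `
    -RunbookName "SuspendResumeAzureAS" -ScheduleName "Resume-8AM" `
    -Parameters @{ action = "resume"; resourceGroupName = "MyResourceGroup"; serverName = "myasserver" }

Register-AzureRmAutomationScheduledRunbook -ResourceGroupName "MyResourceGroup" -AutomationAccountName "MyAutomationAccount" `
    -RunbookName "SuspendResumeAzureAS" -ScheduleName "Suspend-8PM" `
    -Parameters @{ action = "suspend"; resourceGroupName = "MyResourceGroup"; serverName = "myasserver" }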

The final result gives us the desired suspend/resume Azure AS automation.


We hope you’ve learned something from our post. Have fun with Azure automation, and leave your comments below!

Filipe Sousa

Export Power BI Desktop data to SQL Server

Have you ever faced a scenario where you needed to load a collection of CSV/text files into SQL Server tables?

What solution did you choose?

  • TSQL BULK INSERT?
  • SSIS Package (generated from SSMS Tasks->Import Data or manual)
  • PowerShell “Import-CSV”

And what if the SQL Server destination tables must be typed (numeric, date, text columns, …), the CSV files have formatting issues (e.g. text columns without quotes, datetimes not in ISO format), and you need to transform the columns into the desired types?

A much quicker way to transform CSV files into the desired shape is a Power BI Desktop query (or Power Query). For example, in seconds I can:

  • Load the CSV
  • Replace a value in all the columns (in this case the text “NULL” with a real null)
  • Auto-detect the data types


Now, loading these queries into a SQL Server database is very easy thanks to the DevScope PowerShell module “PowerBIETL” (also available in the PowerShell Gallery):


Install-Module PowerBIETL
Import-Module PowerBIETL

Export-PBIDesktopToSQL -pbiDesktopWindowName "*sample*" -sqlConnStr "Data Source=.\SQL2014; Initial Catalog=DestinationDB; Integrated Security=SSPI" -sqlSchema "stg" -verbose

The cmdlet “Export-PBIDesktopToSQL” takes care of:

  1. Connecting to PBI Desktop and reading the tables
  2. Automatically creating the tables in the SQL database (if they do not exist)
    • Thanks to the DevScope “SQLHelper” PowerShell module and its “Invoke-SQLBulkCopy” cmdlet
  3. Bulk copying the data from PBI Desktop into the SQL tables

The cmdlet has 4 parameters:

  • -PBIDesktopWindowName (mandatory)
    • A wildcard to find the PowerBI Desktop window
  • -Tables (optional, defaults to all the tables)
    • Array of tables to import
  • -SQLConnStr (mandatory)
    • Connection to a SQL Server database
  • -SQLSchema (optional, defaults to “dbo”)
    • The schema under which the tables will be created
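
For example, to copy just a couple of tables (the table names here are made up) into a staging schema:

Export-PBIDesktopToSQL -pbiDesktopWindowName "*sample*" -tables @("Sales", "Product") `
    -sqlConnStr "Data Source=.\SQL2014; Initial Catalog=DestinationDB; Integrated Security=SSPI" `
    -sqlSchema "stg" -verbose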

As a result, all the tables from the PBI Desktop file will be copied into the SQL Server database.


Of course, this will only work for “one-time-only” or manual scenarios, but I assure you it is much quicker than building a SQL Server Integration Services package 😉

Twitter Real-Time Analytics in PowerBI with #PowerBIPS

At SQLSaturday Paris I demoed real-time Twitter analytics in Power BI using the PowerShell module PowerBIPS.

In this post I will show how I did it (you can find the full script at the bottom). The script does the following:

  1. Imports the required modules: #PowerBIPS and #InvokeTwitterAPIs
  2. Saves the Twitter OAuth settings in a variable (learn more here)
  3. Starts an infinite loop that polls the Twitter data and, at each iteration:
    1. Calls the Twitter search API to search for new tweets containing the #PowerBI hashtag since the last query
      1. The script stores the “sinceId” in a local file and uses it on the next execution
    2. Parses the Twitter search results and, for each tweet, does some sentiment analysis using the REST API of http://www.sentiment140.com
    3. Sends that data to PowerBI using the Out-PowerBI cmdlet of #PowerBIPS

In the end you should be able to build a dashboard on top of this data.



cls

$ErrorActionPreference = "Stop"

$currentPath = (Split-Path $MyInvocation.MyCommand.Definition -Parent)

# Load System.Web for the [System.Web.HttpUtility]::UrlEncode call used below
Add-Type -AssemblyName System.Web

# Import the PowerBIPS Powershell Module: https://github.com/DevScope/powerbi-powershell-modules
Import-Module "$currentPath\..\PowerBIPS" -Force

# Import the InvokeTwitterAPIs module: https://github.com/MeshkDevs/InvokeTwitterAPIs
Import-Module "$currentPath\Modules\InvokeTwitterAPIs.psm1" -Force

#region Twitter Settings

# Learn how to generate these keys at: https://dev.twitter.com/oauth and https://apps.twitter.com
$accessToken = "your access token key"
$accessTokenSecret = "your access token secret"
$apiKey = "your api key"
$apiSecret = "your api secret"

$twitterOAuth = @{'ApiKey' = $apiKey; 'ApiSecret' = $apiSecret; 'AccessToken' = $accessToken; 'AccessTokenSecret' = $accessTokenSecret}

#endregion

$sinceId = $null
$sinceIdFilePath = "$currentPath\twitterDemoSinceId.txt"

while($true)
{	
	if (Test-Path $sinceIdFilePath)
	{
		$sinceId = Get-Content $sinceIdFilePath
	}	

	# Hashtags to search (separated by comma) and the number of tweets to return, more examples of search options: https://dev.twitter.com/rest/public/search

	$twitterAPIParams = @{'q'='#powerbi';'count' = '5'}

	if (-not [string]::IsNullOrEmpty($sinceId))
	{
		$twitterAPIParams.Add("since_id", $sinceId)
	}

	# Get Twitter data (if SinceId is not null it will get tweets since that one)

	$result = Invoke-TwitterRestMethod -ResourceURL 'https://api.twitter.com/1.1/search/tweets.json' -RestVerb 'GET' -Parameters $twitterAPIParams -OAuthSettings $twitterOAuth -Verbose

	# Parse the Twitter API data

	$twitterData = $result.statuses |? { [string]::IsNullOrEmpty($sinceId) -or $sinceId -ne $_.id_str } |% {
	
		$aux = @{
			Id = $_.id_str
			; UserId = $_.user.id
			; UserName = $_.user.name
			; UserScreenName = $_.user.screen_name
			; UserLocation = $_.user.location
			; Text = $_.text
			; CreatedAt =  [System.DateTime]::ParseExact($_.created_at, "ddd MMM dd HH:mm:ss zzz yyyy", [System.Globalization.CultureInfo]::InvariantCulture)		
		}

		# Get the Sentiment Score

		$textEncoded = [System.Web.HttpUtility]::UrlEncode($aux.Text, [System.Text.Encoding]::UTF8)

		$sentimentResult = Invoke-RestMethod -Uri "http://www.sentiment140.com/api/classify?text=$textEncoded" -Method Get -Verbose

		switch($sentimentResult.results[0].polarity)
		{
			"0" { $aux.Add("Sentiment", "Negative") }
			"4" { $aux.Add("Sentiment", "Positive") }
			default { $aux.Add("Sentiment", "Neutral") }
		}
		
		Write-Output $aux
	}

	if ($twitterData -and $twitterData.Count -ne 0)
	{
		# Persist the SinceId

		$sinceId = ($twitterData | Sort-Object "CreatedAt" -Descending | Select -First 1).Id
		
		Set-Content -Path $sinceIdFilePath -Value $sinceId

		# Send the data to PowerBI

		$twitterData | Out-PowerBI -dataSetName "TwitterPBIAnalysis" -tableName "Tweets" -types @{"Tweets.CreatedAt"="datetime"} -verbose
	}
	else
	{
		Write-Output "No tweets found."
	}

	Write-Output "Sleeping..."

	Sleep -Seconds 30
}

Using #PowerBIPS to Upload CSV Files to PowerBI

In June I wrote a guest post on the Power BI Developer Blog about #PowerBIPS, showing how to upload a collection of local CSV files into Power BI. Check it out here:

http://blogs.msdn.com/b/powerbidev/archive/2015/06/08/using-a-power-bi-app-to-upload-csv-files-to-a-dataset.aspx

PS – Thanks Josh 😉



cls

$ErrorActionPreference = "Stop"

$currentPath = (Split-Path $MyInvocation.MyCommand.Definition -Parent)

 
#Create Archive Folder 

new-item -Name "Archive" -Force -ItemType directory -Path "$currentPath\CSVData"  | Out-Null

Import-Module "$currentPath\Modules\PowerBIPS" -Force

while($true)

{

	# Iterate each CSV file and send to PowerBI

	Get-ChildItem "$currentPath\CSVData" -Filter "*.csv" |% { 

		$file=$_               

		#Import csv and add column with filename

		$data = Import-Csv $file.FullName | select @{Label="File";Expression={$file.Name}}, *

		# Send data to PowerBI

		$data |  Out-PowerBI -dataSetName "CSVSales" -tableName "Sales" -types @{"Sales.OrderDate"="datetime"; "Sales.SalesAmount"="double"; "Sales.Freight"="double"} -batchSize 300 -verbose

		# Archive the file

		Move-Item $file.FullName "$currentPath\CSVData\Archive\" -Force

	}   

	Write-Output "Sleeping..."

	Sleep -Seconds 5

}