Export Power BI Desktop data to CSV Files

A few months ago I published a PowerShell module called PowerBIETL that allows you to export all your Power BI Desktop file data to a SQL Server database:

https://ruiromanoblog.wordpress.com/2016/04/21/use-power-bi-desktop-as-an-etl-tool/

Just updated the module to support export to CSV Files:


Install-Module PowerBIETL

Import-Module PowerBIETL

Export-PBIDesktopToCSV -pbiDesktopWindowName "*PBI Window Name*" -outputPath ".\Output"

After running the PowerShell code above, you will end up with a CSV file for every table in your Power BI Desktop file.
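Since the cmdlet writes one CSV per table, a quick way to sanity-check the output is to count the rows of each file (a minimal sketch, assuming the ".\Output" path used above):

# Count the rows of every exported CSV (Import-Csv excludes the header row)
Get-ChildItem ".\Output" -Filter "*.csv" | ForEach-Object {
    "{0}: {1} rows" -f $_.Name, @(Import-Csv $_.FullName).Count
}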


Thanks to Rui Quintino for the draft code and to Josh Close for creating the amazing CsvHelper!


Automatically Export PBIX’s using #PowerBIPS

Ever wanted to export all your Power BI reports from a workspace and save a copy without manually opening each one and selecting the “Download report” option?


My team at DevScope just updated the PowerBIPS PowerShell module with a simple cmdlet to do just that:

Export-PBIReport -destinationFolder "<local folder>"

With a simple PowerShell script you can download all the reports from your Power BI workspaces (e.g., for backups):



Install-Module PowerBIPS

# Get the Auth Token

$authToken = Get-PBIAuthToken 

# Define the Workspace you want to download the reports from - Optional, by default downloads from personal workspace

Set-PBIGroup -authToken $authToken -name "Demos - PBIFromTrenches" -Verbose

# Downloads the reports to a destination folder

Export-PBIReport -authToken $authToken -destinationFolder "C:\Temp\PBIReports" -Verbose

Azure Analysis Services Tracer (aka AzureASTrace.exe)

Recently we had the need to analyse the queries made by users on Azure Analysis Services and to cross-reference that data with Azure AS metrics, for example to see exactly which queries cause high QPU or memory usage, and who issued them from which application.

Currently, Azure AS allows you to configure an Extended Events session to collect events from your Analysis Services database.


But there’s no easy way to export or save that data for further analysis. You can only watch the live data, and that isn’t very user friendly.


We tried to use the good old ASTrace, but it’s not compatible with Azure Analysis Services, and it’s not a very good practice anyway because it basically creates a Profiler session, and Profiler traces will be deprecated soon.

Because we desperately needed to analyse the user queries to identify bottlenecks, my amazing BI team at DevScope built a great tool called “AzureASTrace” that allows you to point at an Analysis Services instance, instantly start collecting the events you want, and store them in the file system in JSONL format.

You can download it or contribute to it on GitHub: https://github.com/DevScope/Azure-AS-Tracer

It’s very simple to use: just download the binaries and change the following parameters in the ‘AzureASTrace.exe.config’ file (a sketch of doing this from PowerShell follows the list):

  • ConnectionStrings/AnalysisServices: the connection string of the Analysis Services instance you want to monitor
  • AppSettings/XEventTemplateFilePath: the path to the XEvents trace template used to create the monitoring session on the Analysis Services instance
  • AppSettings/OutputFolder: the path to the output folder that will hold the JSONL files

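As a rough sketch, assuming the key names above live in the standard .NET connectionStrings/appSettings sections (verify against the actual file in the repository), you could even set these values from PowerShell:

# Hypothetical sketch: update AzureASTrace.exe.config before the first run
$configPath = ".\AzureASTrace.exe.config"
[xml]$config = Get-Content $configPath

# Connection string of the Azure AS instance to monitor (placeholder value)
$cs = $config.configuration.connectionStrings.add | Where-Object { $_.name -eq "AnalysisServices" }
$cs.connectionString = "Provider=MSOLAP;Data Source=asazure://westeurope.asazure.windows.net/myserver"

# XEvents template and output folder (placeholder paths)
foreach ($setting in $config.configuration.appSettings.add)
{
    switch ($setting.key)
    {
        "XEventTemplateFilePath" { $setting.value = ".\Templates\QueryTrace.xml" }
        "OutputFolder"           { $setting.value = "C:\Temp\AzureASTraceOutput" }
    }
}

$config.Save((Resolve-Path $configPath).Path)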

After that you have two options:

  • Run AzureASTrace as a console application by simply executing AzureASTrace.exe


  • Run AzureASTrace as a Windows service by running ‘setup.install.bat’ and starting the service


Either way, while running, the events will be saved to the output folder. AzureASTrace creates a file for every subscribed event type and groups the files by day.

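Because the output is JSONL (one JSON document per line), the files are trivial to parse, for example in PowerShell (a minimal sketch; the file path and the Duration property are illustrative and depend on your XEvents template):

# Read a day's worth of events from a JSONL file (path is illustrative)
$events = Get-Content "C:\Temp\AzureASTraceOutput\QueryEnd.20171020.jsonl" |
    ForEach-Object { $_ | ConvertFrom-Json }

# E.g. show the longest-running events first (property names depend on the template)
$events | Sort-Object Duration -Descending | Select-Object -First 10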

Now you can analyze those events in Power BI (coming soon) very easily…

Slides of “#PowerBI from the Trenches” at #TugaIT

Hi,

Big thank you to all who attended my session at #TugaIT, I had a great time!


It was a simple but very practical tips & tricks talk around Power BI.

Download the slides with all the tips here: https://1drv.ms/b/s!AhYJIuS7UANdgb0M4lRWaiMLRkkVWg

Thanks and see you next year!

Export Power BI Desktop data to SQL Server

Have you ever faced a scenario where you needed to load a collection of CSV/text files into SQL Server tables?

What solution did you choose?

  • T-SQL BULK INSERT
  • SSIS package (generated from SSMS Tasks -> Import Data, or manual)
  • PowerShell “Import-Csv”

And what if the SQL Server destination tables must be typed (numeric, date, text columns, …) and the CSV file has formatting issues (e.g., text columns without quotes, datetimes not in ISO format), so you need to transform the columns into the desired types?

A much quicker way to transform CSV files into the desired shape is to use a Power BI Desktop query (Power Query). For example, in seconds I can:

  • Load the CSV
  • Replace a value in all the columns (in this case, the text “NULL” with a real null)
  • Auto detect the datatypes


Now, loading these queries into a SQL Server database is very easy thanks to the DevScope PowerShell module “PowerBIETL” (also available in the PowerShell Gallery):


Install-Module PowerBIETL
Import-Module PowerBIETL

Export-PBIDesktopToSQL -pbiDesktopWindowName "*sample*" -sqlConnStr "Data Source=.\SQL2014; Initial Catalog=DestinationDB; Integrated Security=SSPI" -sqlSchema "stg" -verbose

The cmdlet “Export-PBIDesktopToSQL” takes care of:

  1. Connecting to the PBI Desktop file and reading the tables
  2. Automatically creating the tables in the SQL database (if they do not exist)
    • Thanks to the DevScope “SQLHelper” PowerShell module and its “Invoke-SQLBulkCopy” cmdlet
  3. Bulk copying the data from PBI Desktop into the SQL tables

The cmdlet has 4 parameters (see the example after the list):

  • -PBIDesktopWindowName (mandatory)
    • A wildcard to find the PowerBI Desktop window
  • -Tables (optional, defaults to all the tables)
    • Array of tables to import
  • -SQLConnStr (mandatory)
    • Connection string to a SQL Server database
  • -SQLSchema (optional, defaults to “dbo”)
    • The schema under which the tables will be created
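For example, to import only a couple of tables into a custom schema (the table names here are purely illustrative):

# Illustrative: import only two tables into the "stg" schema
Export-PBIDesktopToSQL -pbiDesktopWindowName "*sample*" `
    -tables @("Sales", "Products") `
    -sqlConnStr "Data Source=.\SQL2014; Initial Catalog=DestinationDB; Integrated Security=SSPI" `
    -sqlSchema "stg" -verbose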

As a result, all the tables from the PBI Desktop file get copied into the SQL Server database.


Of course, this will only work for those “one-time-only” or manual scenarios, but I assure you it’s much quicker than using a SQL Server Integration Services package ;)

Power BI Desktop Trace Logs Analyser

In this post I will show you how to analyse Power BI Desktop diagnostic trace files in a more visual way than Notepad :)

First you need to collect some diagnostics by enabling tracing in Power BI Desktop: go to File -> Options -> Diagnostics -> Enable Tracing.

If you click on “Open Traces folder”, it will open the trace folder with all the trace logs.

PS: Trace logs are only generated after you exercise your Power BI report, so do some refreshes and interactions first to create them.
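If you prefer to locate them from the command line, this quick sketch lists the most recent trace files (the folder below is the typical location for a non-Store install of Power BI Desktop; adjust the path if yours differs):

# Typical traces folder (non-Store install); adjust the path if yours differs
$traceFolder = Join-Path $env:LOCALAPPDATA "Microsoft\Power BI Desktop\Traces"

# Show the five most recently written trace logs
Get-ChildItem $traceFolder -File |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 5 Name, LastWriteTime, Length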

Now, to analyse these logs you could of course open them in Notepad.


But that is not very easy to read, so what better way to process and visualize this huge amount of text data than Power BI itself?

So I created a Power BI Desktop file to process and visualize the trace logs, which allows you to quickly see things like:

  • Errors
  • Duration of queries
  • Performance issues
  • etc


Usage instructions:

  • Download and open the Power BI Desktop file
  • “Edit Queries” and change the variable “VAR_LogFolder” to point to the trace logs folder


  • Refresh the Report


Twitter Real-Time Analytics in PowerBI with #PowerBIPS

At SQLSaturday Paris I did a demo of real-time Twitter analytics in Power BI using the PowerShell module PowerBIPS.

In this post I will show how I did it (you can find the full script at the bottom). The script does the following:

  1. Imports the required modules: #PowerBIPS and #InvokeTwitterAPIs
  2. Saves the Twitter OAuth settings in a variable (learn more here)
  3. Starts an infinite loop that polls the Twitter data; at each iteration it:
    1. Calls the Twitter search API to look for new tweets containing the #PowerBI hashtag since the last query
      1. The script stores the “sinceId” in a local file and reuses it on the next execution
    2. Parses the Twitter search results and, for each tweet, does some sentiment analysis using the REST API of http://www.sentiment140.com
    3. Sends that data to PowerBI using the Out-PowerBI cmdlet of #PowerBIPS

At the end you should be able to build a dashboard similar to this one.



Clear-Host

$ErrorActionPreference = "Stop"

# System.Web is needed for [System.Web.HttpUtility]::UrlEncode used below
Add-Type -AssemblyName System.Web

$currentPath = (Split-Path $MyInvocation.MyCommand.Definition -Parent)

# Import the PowerBIPS Powershell Module: https://github.com/DevScope/powerbi-powershell-modules
Import-Module "$currentPath\..\PowerBIPS" -Force

# Import the InvokeTwitterAPIs module: https://github.com/MeshkDevs/InvokeTwitterAPIs
Import-Module "$currentPath\Modules\InvokeTwitterAPIs.psm1" -Force

#region Twitter Settings

# Learn how to generate these keys at: https://dev.twitter.com/oauth and https://apps.twitter.com
$accessToken = "your access token key"
$accessTokenSecret = "your access token secret"
$apiKey = "your api key"
$apiSecret = "your api secret"

$twitterOAuth = @{'ApiKey' = $apiKey; 'ApiSecret' = $apiSecret; 'AccessToken' = $accessToken; 'AccessTokenSecret' = $accessTokenSecret}

#endregion

$sinceId = $null
$sinceIdFilePath = "$currentPath\twitterDemoSinceId.txt"

while($true)
{	
	if (Test-Path $sinceIdFilePath)
	{
		$sinceId = Get-Content $sinceIdFilePath
	}	

	# Hashtags to search (separated by comma) and the number of tweets to return, more examples of search options: https://dev.twitter.com/rest/public/search

	$twitterAPIParams = @{'q'='#powerbi';'count' = '5'}

	if (-not [string]::IsNullOrEmpty($sinceId))
	{
		$twitterAPIParams.Add("since_id", $sinceId)
	}

	# Get Twitter Data (if SinceId is not null, it will get the tweets since that one)

	$result = Invoke-TwitterRestMethod -ResourceURL 'https://api.twitter.com/1.1/search/tweets.json' -RestVerb 'GET' -Parameters $twitterAPIParams -OAuthSettings $twitterOAuth -Verbose

	# Parse the Twitter API data

	$twitterData = $result.statuses | Where-Object { [string]::IsNullOrEmpty($sinceId) -or $sinceId -ne $_.id_str } | ForEach-Object {
	
		$aux = @{
			Id = $_.id_str
			; UserId = $_.user.id
			; UserName = $_.user.name
			; UserScreenName = $_.user.screen_name
			; UserLocation = $_.user.location
			; Text = $_.text
			; CreatedAt =  [System.DateTime]::ParseExact($_.created_at, "ddd MMM dd HH:mm:ss zzz yyyy", [System.Globalization.CultureInfo]::InvariantCulture)		
		}

		# Get the Sentiment Score

		$textEncoded = [System.Web.HttpUtility]::UrlEncode($aux.Text, [System.Text.Encoding]::UTF8)

		$sentimentResult = Invoke-RestMethod -Uri "http://www.sentiment140.com/api/classify?text=$textEncoded" -Method Get -Verbose

		switch($sentimentResult.results[0].polarity)
		{
			"0" { $aux.Add("Sentiment", "Negative") }
			"4" { $aux.Add("Sentiment", "Positive") }
			default { $aux.Add("Sentiment", "Neutral") }
		}
		
		Write-Output $aux
	}

	if ($twitterData -and $twitterData.Count -ne 0)
	{
		# Persist the SinceId

		$sinceId = ($twitterData | Sort-Object "CreatedAt" -Descending | Select-Object -First 1).Id
		
		Set-Content -Path $sinceIdFilePath -Value $sinceId

		# Send the data to PowerBI

		$twitterData | Out-PowerBI -dataSetName "TwitterPBIAnalysis" -tableName "Tweets" -types @{"Tweets.CreatedAt"="datetime"} -verbose
	}
	else
	{
		Write-Output "No tweets found."
	}

	Write-Output "Sleeping..."

	Start-Sleep -Seconds 30
}