Convert a Power BI Desktop report from Import to Live Query

Recently I came across this great blog post from Dustin Ryan on how to convert a Power BI Desktop file from Import to Live:

But I’ve been using another technique that is very simple and doesn’t require any external tools; you just need to execute five simple steps in Power BI Desktop:

1. Open the PBIX with Power BI Desktop

2. Open the “Query Editor”, select all the Queries and click on “Delete”:


3. Click on “Close & Apply”:


And you will end up with something like this:


4. Now click on “Get Data” and “Analysis Services”:


5. Type the Analysis Services (or Azure Analysis Services) server address and database name, and click on the option “Connect Live”:


And that’s it!

If your Analysis Services model has the same fields and measures, the report will work just fine, and your PBIX is now converted from “Import” to “Live”:



Power BI Custom Visual–Filter by List

Have you ever gotten this request from a customer:

“Is it possible to filter a Power BI report with a list of 100+ items? Do I need to pick one by one on a slicer?”

We do, and because of that the DevScope team built a custom visual to ease that pain: “Filter by List”.


This visual is very simple but also very powerful and very easy to use:

  • Add the visual
  • Select a field
  • Paste/Write a collection of items
  • Filter – the custom visual will filter the report page to all the items that “match”

See this video to learn more:

Export Power BI Desktop data to CSV Files

A few months ago I published a PowerShell module called PowerBIETL that allows you to export all your Power BI Desktop file data to a SQL Server database:

I’ve just updated the module to also support exporting to CSV files:

Install-Module PowerBIETL
Import-Module PowerBIETL

# Export every table of the open PBIX (identified by its window name) to CSV files
Export-PBIDesktopToCSV -pbiDesktopWindowName "*PBI Window Name*" -outputPath ".\Output"

After running the PowerShell code above you will end up with a CSV file for every table in your Power BI Desktop file:
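
If you want to quickly check one of the exported files, you can load it back with plain PowerShell (the “Sales.csv” name below is just a placeholder for one of your table names):

# Preview the first rows of an exported table
Import-Csv ".\Output\Sales.csv" | Select-Object -First 5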


Thanks to Rui Quintino for the draft code and Josh Close for creating the amazing CsvHelper!


Automatically Export PBIXs using #PowerBIPS

Ever wanted to export all your Power BI reports from a workspace and save a copy without manually going to each one and selecting the “Download report” option?


My team at DevScope just updated the PowerBIPS PowerShell module with a simple cmdlet to do just that:

Export-PBIReport -destinationFolder "<local folder>"

With an easy PowerShell script you can download all the reports from your Power BI workspaces (for backups, for example):

Install-Module PowerBIPS

# Get the auth token
$authToken = Get-PBIAuthToken

# Define the workspace you want to download the reports from (optional; by default the personal workspace is used)
Set-PBIGroup -authToken $authToken -name "Demos - PBIFromTrenches" -Verbose

# Download the reports to a destination folder
Export-PBIReport -authToken $authToken -destinationFolder "C:\Temp\PBIReports" -Verbose
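
And since Set-PBIGroup switches the current workspace, you can wrap this in a loop to back up every workspace you have access to. A minimal sketch, assuming Get-PBIGroup lists your workspaces and returns objects with a name property (check the module documentation to confirm):

# Download the reports from every workspace into a folder per workspace
$authToken = Get-PBIAuthToken

Get-PBIGroup -authToken $authToken | ForEach-Object {
    $folder = "C:\Temp\PBIReports\" + $_.name
    New-Item -ItemType Directory -Path $folder -Force | Out-Null
    Set-PBIGroup -authToken $authToken -name $_.name
    Export-PBIReport -authToken $authToken -destinationFolder $folder -Verbose
}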


Azure Analysis Services Tracer (aka AzureASTrace.exe)

Recently we had the need to analyse the queries made by users on Azure Analysis Services and to cross-reference that data with Azure AS metrics, for example to see exactly which queries are causing high QPUs or memory usage and who made them from which application.

Currently Azure AS allows you to configure an Extended Events session to collect events from your Analysis Services database:


But there’s no easy way to export or save that data for further analysis. You can only watch the live data, and it’s not very user-friendly:


We tried to use the good old ASTrace, but it’s not compatible with Azure Analysis Services, and it’s not a very good practice anyway because it basically creates a Profiler session, which will be deprecated soon.

Because we desperately needed to analyse the user queries to identify bottlenecks, my amazing BI team at DevScope built a great tool called “Azure-As-Trace” that allows you to point at an Analysis Services instance, instantly start collecting the events you want, and store them in the file system in JSONL format.

You can download it or contribute to it on GitHub:

It’s very simple to use: you just need to download the binaries and change the following parameters in the ‘AzureASTrace.exe.config’ file:

  • ConnectionStrings/AnalysisServices – The connection string of the Analysis Services instance you want to monitor
  • AppSettings/XEventTemplateFilePath – The path to the XEvents trace template used to create the monitoring session on the Analysis Services instance
  • AppSettings/OutputFolder – The path to the output folder that will hold the JSONL files
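
Since AzureASTrace.exe.config is a standard .NET configuration file, the three settings map to XML roughly like the sketch below; the server address, template path and output folder are placeholder values, so verify against the config file that ships with the binaries:

<!-- Illustration only: all values below are placeholders -->
<configuration>
  <connectionStrings>
    <add name="AnalysisServices"
         connectionString="Data Source=asazure://westeurope.asazure.windows.net/myserver" />
  </connectionStrings>
  <appSettings>
    <add key="XEventTemplateFilePath" value=".\Templates\QueryEvents.xml" />
    <add key="OutputFolder" value=".\Output" />
  </appSettings>
</configuration>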


After that you have two options:

  • Run AzureASTracer as a console application, by simply executing AzureASTrace.exe


  • Run AzureASTracer as a Windows service, by running ‘setup.install.bat’ and starting the service


Either way, while it is running, the events will be saved to the output folder; AzureASTrace will create a file for every subscribed event type and group the files by day:


Now you can very easily analyze those events in Power BI (coming soon)…
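
In the meantime, because the output is JSONL (one JSON document per line), it’s easy to peek at the events directly with PowerShell; the file path below is hypothetical, adjust it to whatever files AzureASTrace creates for your subscribed events:

# Read one day's worth of events into objects for ad-hoc analysis
Get-Content ".\Output\QueryEnd\20170530.jsonl" |
    ForEach-Object { $_ | ConvertFrom-Json } |
    Select-Object -First 10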


Slides of “#PowerBI from the Trenches” at #TugaIT


Big thank you to all who attended my session at #TugaIT, I had a great time:


It was a simple but very practical tips & tricks talk around Power BI.

Download the slides with all the tips here.

Thanks and see you next year!


Automatically pause/resume and scale up/down Azure Analysis Services using AzureRM.AnalysisServices

One of the big advantages of Azure Analysis Services is the ability to pause/resume and scale up/down as needed; this allows you to pay only for what you use and greatly reduce costs.

The Azure Analysis Services team released a PowerShell module, “AzureRM.AnalysisServices”, with cmdlets to manage your Azure Analysis Services resources, and they couldn’t be easier to use:

  • Get-AzureRmAnalysisServicesServer – To get your server metadata and current status
  • Suspend-AzureRmAnalysisServicesServer – To suspend a server
  • Resume-AzureRmAnalysisServicesServer – To Resume a server
  • Set-AzureRmAnalysisServicesServer – To update a server (e.g. change the SKU)

More details here.
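
For reference, this is roughly how the four cmdlets are called; the resource group and server names below are placeholders for your own:

# Check the current status of the server
Get-AzureRmAnalysisServicesServer -ResourceGroupName "MyResourceGroup" -Name "myasserver"

# Pause the server (you stop paying while it's paused)
Suspend-AzureRmAnalysisServicesServer -ResourceGroupName "MyResourceGroup" -Name "myasserver"

# Resume the server
Resume-AzureRmAnalysisServicesServer -ResourceGroupName "MyResourceGroup" -Name "myasserver"

# Scale up/down by changing the SKU
Set-AzureRmAnalysisServicesServer -ResourceGroupName "MyResourceGroup" -Name "myasserver" -Sku "S1"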

But this is only effective if we somehow automate the operation; it’s not feasible to have someone on the team or the customer actively pausing/resuming or scaling the instance up/down.

With that in mind we built a very simple PowerShell script where you configure at which times and on which days Azure AS should be on, and with which SKU.

Download the full script here.

The script is configured by JSON metadata:
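
Purely as an illustration (the property names here are hypothetical; the actual schema is in the downloadable script), the -configStr value for the schedule below could look something like this:

# Hypothetical example of the -configStr value; property names are guesses
$configStr = '[
    { "days": "Mon-Fri", "hours": "08:00-18:00", "sku": "S1" },
    { "days": "Mon-Fri", "hours": "18:00-00:00", "sku": "S0" },
    { "days": "Sat-Sun", "hours": "08:00-00:00", "sku": "S0" }
]'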


The above metadata will configure Azure AS to:

Days      Hours                         SKU
Mon-Fri   08:00 to 18:00 (peak hours)   S1
Mon-Fri   18:00 to 00:00 (off-peak)     S0
Sat-Sun   08:00 to 00:00                S0
Other     Other                         Pause

The PowerShell script has the following parameters:

-resourceGroupName The name of the Azure resource group your Azure AS server is deployed to:


-serverName The name of the Azure AS Server:


-configStr The JSON metadata config string


The path to an Azure profile stored locally using the “Save-AzureRmContext” cmdlet.

This is useful to test the script locally.



-azureRunAsConnectionName The name of the Azure connection to use if you want to run this script in an Azure Automation runbook:


Probably most of you will want to run this script in a PowerShell runbook in Azure Automation; learn how to set up Azure Automation here.

Next you will need to register the module “AzureRM.AnalysisServices” as an asset of your Automation account:


After that just create a new PowerShell runbook:


Paste the script and remember to set the parameter -azureRunAsConnectionName:


Publish the script and create a schedule:



That’s it! You now have your Azure AS automatically pausing/resuming and scaling up/down using the configuration you defined.

Now just lay back and measure the savings at the end of the month!