Recently I came across this great blog post from Dustin Ryan on how to convert a Power BI Desktop File from Import to Live:
But I’ve been using another technique that is very simple and doesn’t require any external tools. You just need to execute 5 simple steps in Power BI Desktop:
1. Open the PBIX with Power BI Desktop
2. Open the “Query Editor”, select all the Queries and click on “Delete”:
3. Click on “Close & Apply”:
And you will end up with something like this:
4. Now click on “Get Data” and then “Analysis Services”:
5. Enter the Analysis Services (or Azure Analysis Services) server address and database name, and select the “Connect Live” option:
And that’s it!
If your Analysis Services model has the same fields and measures, the report will work just fine, and your PBIX is now converted from “Import” to “Live”:
Have you ever gotten this request from a customer:
“Is it possible to filter a Power BI report with a list of 100+ items? Do I need to pick one by one on a slicer?”
We have, and because of that the DevScope team built a custom visual to ease that pain: “Filter by List”
This visual is very simple, but also very powerful and easy to use:
- Add the visual
- Select a field
- Paste/Write a collection of items
- Filter: the custom visual will filter the report page to all the items that “match”
See this video to learn more:
A few months ago I published a PowerShell module called PowerBIETL that allows you to export all your Power BI Desktop file data to a SQL Server database:
I’ve just updated the module to also support exporting to CSV files:
Export-PBIDesktopToCSV -pbiDesktopWindowName "*PBI Window Name*" -outputPath ".\Output"
After running the PowerShell code above, you will end up with a CSV file for every table in your Power BI Desktop file:
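If you want to sanity-check the export, a quick script can list each CSV and its row count. This sketch is in Python purely for illustration; it is not part of the PowerBIETL module, and the folder path is just the `-outputPath` used above:

```python
import csv
import os

def summarize_csv_folder(path):
    """Return {file name: data row count} for every CSV in `path` (header excluded)."""
    summary = {}
    for name in sorted(os.listdir(path)):
        if not name.lower().endswith(".csv"):
            continue
        with open(os.path.join(path, name), newline="", encoding="utf-8") as f:
            rows = list(csv.reader(f))
        summary[name] = max(len(rows) - 1, 0)  # subtract the header row
    return summary
```

Calling `summarize_csv_folder(".\\Output")` returns a dict with one entry per exported table, which is an easy way to confirm every table made it out.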
Thanks to Rui Quintino for the draft code and Josh Close for creating the amazing CsvHelper!
Ever wanted to export all the Power BI reports from a workspace and save a copy without manually going to each one and selecting the “Download report” option?
My team at DevScope just updated the PowerBIPS PowerShell module with a simple cmdlet to do just that:
Export-PBIReport -destinationFolder "<local folder>"
With a simple PowerShell script you can download all the reports from your Power BI workspaces (e.g. for backups):
# Get the Auth Token
$authToken = Get-PBIAuthToken
# Define the Workspace you want to download the reports from - Optional, by default downloads from personal workspace
Set-PBIGroup -authToken $authToken -name "Demos - PBIFromTrenches" -Verbose
# Downloads the reports to a destination folder
Export-PBIReport -authToken $authToken -destinationFolder "C:\Temp\PBIReports" -Verbose
Recently we needed to analyse the queries made by users on Azure Analysis Services and to cross-reference that data with Azure AS metrics: for example, to see exactly which queries are causing high QPU or memory usage, and who made them from which application.
Currently Azure AS allows you to configure an Extended Events session to collect events from your Analysis Services database:
But there’s no easy way to export or save that data for further analysis; you can only watch live data, and it’s not very user friendly:
We tried to use the good old ASTrace, but it’s not compatible with Azure Analysis Services, and it’s not a very good practice anyway, because it basically creates a Profiler session, and Profiler traces will be deprecated soon.
Because we desperately needed to analyse the user queries to identify bottlenecks, my amazing BI team at DevScope built a great tool called “Azure-AS-Tracer” that lets you point at an Analysis Services instance and instantly start collecting the events you want, storing them in the file system in JSONL format.
You can download it or contribute to it on GitHub: https://github.com/DevScope/Azure-AS-Tracer
It’s very simple to use: you just need to download the binaries and change the following parameters in the config file ‘AzureASTrace.exe.config’:
- The connection string to the Analysis Services instance you want to monitor
- The path to the XEvents trace template used to create the monitoring session on the Analysis Services instance
- The path to the output folder that will hold the JSONL files
After that you have two options:
- Run it as a console application, by simply executing AzureASTrace.exe
- Run it as a Windows service, by running ‘setup.install.bat’ and starting the service
Either way, while running, the events will be saved to the output folder: AzureASTrace creates a file for every subscribed event type and groups the files by day:
Now you can very easily analyze those events in Power BI (coming soon)…
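Until that Power BI template is available, the JSONL files are easy to consume with anything that reads one JSON object per line. Here is a minimal sketch in Python, purely for illustration; the recursive folder layout and `.json`/`.jsonl` extensions are assumptions based on the per-event-type, per-day file structure described above, not a documented schema of the tool:

```python
import glob
import json
import os

def load_events(output_folder):
    """Parse every JSON-lines trace file under `output_folder` into a list of dicts.

    Assumes one JSON object per line, files grouped in subfolders (e.g. by
    event type and day, as AzureASTrace does), with a .json/.jsonl extension.
    """
    events = []
    pattern = os.path.join(output_folder, "**", "*.json*")
    for path in glob.glob(pattern, recursive=True):
        with open(path, encoding="utf-8") as f:
            for line in f:
                line = line.strip()
                if line:
                    events.append(json.loads(line))
    return events
```

From there you can group, count, or filter the events with whatever tooling you prefer before the data ever reaches a report.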
Big thank you to all who attended my session at #TugaIT, I had a great time:
It was a simple but very practical tips & tricks talk around Power BI.
Download the slides with all the tips here: https://1drv.ms/b/s!AhYJIuS7UANdgb0M4lRWaiMLRkkVWg
Thanks and see you next year!
Have you ever faced a scenario where you needed to load a collection of CSV/text files into SQL Server tables?
What solution did you choose?
- TSQL BULK INSERT?
- SSIS Package (generated from SSMS Tasks->Import Data or manual)
- PowerShell “Import-CSV”
And what if the SQL Server destination tables must be typed (numeric, date, text columns, …) and the CSV file has formatting issues (e.g. text columns without quotes, datetimes not in ISO format), so you need to transform the columns into the desired types?
A much quicker way to transform CSV files into the desired shape is to use a Power BI Desktop query (or Power Query); for example, in seconds I can:
- Load the CSV
- Replace a value in all the columns (in this case, the text “NULL” with a real null)
- Auto detect the datatypes
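For readers who want to script the same shape outside Power Query, here is a rough stdlib-Python equivalent of those three steps. The type-detection order (int, then float, then ISO date) is my own naive choice for the sketch, not what Power Query actually does:

```python
import csv
from datetime import datetime

def coerce(value):
    """The literal text 'NULL' (or an empty cell) becomes a real null;
    then try int, float, and ISO date in turn; otherwise keep the text."""
    if value is None or value in ("NULL", ""):
        return None
    for cast in (int, float):
        try:
            return cast(value)
        except ValueError:
            pass
    try:
        return datetime.fromisoformat(value)
    except ValueError:
        return value  # leave as plain text

def load_typed_csv(path):
    """Load a CSV and apply coerce() to every cell, mimicking the steps above."""
    with open(path, newline="", encoding="utf-8") as f:
        return [{k: coerce(v) for k, v in row.items()} for row in csv.DictReader(f)]
```

It won’t match Power Query’s locale-aware detection, but it shows how little code the “replace NULL, then type the columns” idea really needs.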
Loading these queries into a SQL Server database is then very easy, thanks to the DevScope PowerShell module “PowerBIETL” (also available in the PowerShell Gallery):
Export-PBIDesktopToSQL -pbiDesktopWindowName "*sample*" -sqlConnStr "Data Source=.\SQL2014; Initial Catalog=DestinationDB; Integrated Security=SSPI" -sqlSchema "stg" -verbose
The cmdlet “Export-PBIDesktopToSQL” takes care of:
- Connecting to PBI Desktop and reading the tables
- Automatically creating the tables in the SQL database (if they do not exist)
  - Thanks to the DevScope “SQLHelper” PowerShell module and its “Invoke-SQLBulkCopy” cmdlet
- Bulk copying the data from PBI Desktop into the SQL tables
The cmdlet has 4 parameters:
- -PBIDesktopWindowName (mandatory)
- A wildcard to find the PowerBI Desktop window
- -Tables (optional, defaults to all the tables)
- Array of tables to import
- -SQLConnStr (mandatory)
- Connection to a SQL Server database
- -SQLSchema (optional, defaults to “dbo”)
- The schema under which the tables will be created
As a result all the tables from the PBI Desktop file will get copied into the SQL Server database:
Of course, this will only work for those “one-time-only” or manual scenarios, but I assure you it is much quicker than using a SQL Server Integration Services package