Have you ever faced a scenario where you needed to load a collection of CSV/text files into SQL Server tables?
What solution did you choose?
- T-SQL BULK INSERT?
- An SSIS package (generated from the SSMS Tasks -> Import Data wizard, or built manually)?
- PowerShell "Import-Csv"?
And what if the destination SQL Server tables must be typed (numeric, date, text columns, …) while the CSV files have formatting issues (e.g. text columns without quotes, datetimes not in ISO format), so you need to transform the columns into the desired types?
A much quicker way to transform CSV files into the desired shape is a Power BI Desktop query (or Power Query). For example, in seconds I can:
- Load the CSV
- Replace a value in all the columns (in this case the text "NULL" with a real null)
- Auto detect the datatypes
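The three steps above roughly correspond to a Power Query (M) expression like this one (the file path and the column names in the typing step are illustrative, not from the actual sample file):

```
let
    // Load the CSV
    Source = Csv.Document(File.Contents("C:\temp\sample.csv"), [Delimiter = ",", Encoding = 65001]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // Replace the text "NULL" with a real null in every column
    Nulled = Table.ReplaceValue(Promoted, "NULL", null, Replacer.ReplaceValue, Table.ColumnNames(Promoted)),
    // Apply the detected datatypes (hypothetical column names)
    Typed = Table.TransformColumnTypes(Nulled, {{"Id", Int64.Type}, {"CreatedOn", type datetime}, {"Name", type text}})
in
    Typed
```

In Power BI Desktop the "replace value" and "detect data type" steps are generated by the UI, but they compile down to expressions much like this.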
```powershell
Install-Module PowerBIETL
Import-Module PowerBIETL

Export-PBIDesktopToSQL -pbiDesktopWindowName "*sample*" `
    -sqlConnStr "Data Source=.\SQL2014; Initial Catalog=DestinationDB; Integrated Security=SSPI" `
    -sqlSchema "stg" -verbose
```
The cmdlet "Export-PBIDesktopToSQL" takes care of:
- Connecting to PBI Desktop and reading the tables
- Automatically creating the tables on the SQL database (if they do not exist)
  - Thanks to the DevScope "SQLHelper" PowerShell module and its "Invoke-SQLBulkCopy" cmdlet
- Bulk copying the data from PBI Desktop into the SQL tables
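The internals belong to the DevScope modules, but the bulk-copy step is conceptually equivalent to using .NET's SqlBulkCopy directly from PowerShell (a minimal sketch; $dataTable stands in for the rows read from the PBI Desktop model, and the table name is hypothetical):

```powershell
# Assumption: $dataTable is a System.Data.DataTable already filled
# with the rows read from the PBI Desktop model.
$connStr = "Data Source=.\SQL2014; Initial Catalog=DestinationDB; Integrated Security=SSPI"

$bulk = New-Object System.Data.SqlClient.SqlBulkCopy($connStr)
$bulk.DestinationTableName = "stg.Sales"   # hypothetical destination table
$bulk.WriteToServer($dataTable)            # streams all rows in one bulk operation
$bulk.Close()
```

This is why the copy is fast: the rows go through the bulk-load API instead of row-by-row INSERT statements.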
The cmdlet has 4 parameters:
- -PBIDesktopWindowName (mandatory)
  - A wildcard to find the Power BI Desktop window
- -Tables (optional, defaults to all the tables)
  - Array of table names to import
- -SQLConnStr (mandatory)
  - Connection string to the SQL Server database
- -SQLSchema (optional, defaults to "dbo")
  - The schema under which the tables will be created
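Putting the optional parameters to use, a call that imports only a couple of tables (the table names here are illustrative) into a "stg" schema could look like this:

```powershell
# Import only the "Sales" and "Customers" tables (hypothetical names)
# from the matching PBI Desktop window into the "stg" schema:
Export-PBIDesktopToSQL -pbiDesktopWindowName "*sample*" `
    -tables @("Sales", "Customers") `
    -sqlConnStr "Data Source=.\SQL2014; Initial Catalog=DestinationDB; Integrated Security=SSPI" `
    -sqlSchema "stg" -verbose
```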
As a result, all the tables from the PBI Desktop file get copied into the SQL Server database.
Of course, this will only work for "one-time-only" or manual scenarios, but I assure you it is much quicker than building a SQL Server Integration Services package.