dbatools is a free PowerShell module with over 300 SQL Server administration, best practice and migration commands included.

Please note that documentation and command names may be out of date while we work furiously towards 1.0.


Import-DbaCsvToSql

Import-DbaCsvToSql takes advantage of .NET’s super fast SqlBulkCopy class to import CSV files into SQL Server at up to 90,000 rows a second.
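For context, the core of this approach can be sketched directly in PowerShell. The snippet below is a minimal illustration of the SqlBulkCopy pattern, not the module's actual implementation; the server name, connection string, destination table, and DataTable contents are placeholder assumptions, and running it requires a reachable SQL Server instance.

```powershell
# Illustrative SqlBulkCopy sketch (NOT dbatools' internal code).
# Assumes a reachable instance "sql001" and an existing dbo.housing table.
$connString = "Server=sql001;Database=markets;Integrated Security=True"

# A tiny in-memory DataTable standing in for parsed CSV rows.
$table = New-Object System.Data.DataTable
[void]$table.Columns.Add("Address", [string])
[void]$table.Columns.Add("Price", [int])
[void]$table.Rows.Add("1 Main St", 250000)

$conn = New-Object System.Data.SqlClient.SqlConnection $connString
$conn.Open()
try {
    # TableLock mirrors the table-lock behavior described below;
    # UseInternalTransaction keeps each batch atomic.
    $options = [System.Data.SqlClient.SqlBulkCopyOptions]::TableLock -bor
               [System.Data.SqlClient.SqlBulkCopyOptions]::UseInternalTransaction
    $bulk = New-Object System.Data.SqlClient.SqlBulkCopy($conn, $options, $null)
    $bulk.DestinationTableName = "dbo.housing"
    $bulk.BatchSize = 50000
    $bulk.WriteToServer($table)
}
finally {
    $conn.Close()
}
```

SqlBulkCopy streams rows to the server in batches rather than issuing row-by-row INSERT statements, which is what makes throughput in the tens of thousands of rows per second possible.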

The entire import is contained within a transaction, so if a failure occurs or the script is aborted, no changes will persist.

If the specified table does not exist, it will be created automatically using best-guess data types. In addition, the destination table can be truncated prior to import.

The -Query parameter can be used to import only the data returned by a SQL query executed against the CSV file(s). This function supports a number of bulk copy options; please see the parameter list for details.

Screenshots

[Screenshot: dbatools - Import-DbaCsvToSql]

Examples

Imports the entire comma-delimited housing.csv to the "markets" database on a SQL Server named sql001.
Since a table name was not specified, the table name is automatically determined from the file name as "housing", and a prompt will appear to confirm the table name.
The first row is not skipped, as it does not contain column names.

Import-DbaCsvToSql -Csv C:\temp\housing.csv -SqlInstance sql001 -Database markets

Imports the first 100,000 rows of the tab-delimited housing.csv file to the "housing" table in the "markets" database on a SQL Server named sql001. Since -Safe was specified, the OleDB method will be used for the bulk import. The first row is skipped, as it contains column names.

Import-DbaCsvToSql -Csv .\housing.csv -SqlInstance sql001 -Database markets -Table housing -First 100000 -Safe -Delimiter "`t" -FirstRowColumns

Imports all records from the pipe-delimited huge.txt file using the fastest method possible into the "latitudes" table within the "locations" database. Obtains a table lock for the duration of the bulk copy operation. This specific command has been used to import over 10.5 million rows in 2 minutes.

Import-DbaCsvToSql -Csv C:\temp\huge.txt -SqlInstance sqlcluster -Database locations -Table latitudes -Delimiter "|" -Turbo

Truncates the "housing" table, then imports columns 1 and 3 of the first 100,000 rows of the tab-delimited housing.csv in the C:\temp directory, and housing2.csv in the current directory. Since the query is executed against both files, a total of 200,000 rows will be imported.

Import-DbaCsvToSql -Csv C:\temp\housing.csv, .\housing2.csv -SqlInstance sql001 -Database markets -Table housing -Delimiter "`t" -Query "select top 100000 column1, column3 from csv" -Truncate

Author

This command was created by Chrissy LeMaire. You can find Chrissy on Twitter.

Get-Help

From PowerShell, execute Get-Help Import-DbaCsvToSql -Detailed for more information on this function.

Source Code

Want to see the source code? View Import-DbaCsvToSql.ps1 on GitHub.

Related commands