managing data compression with dbatools

Data compression is not a new feature in SQL Server. In fact, it has been around since SQL Server 2008, so why does it matter now? Before SQL Server 2016 SP1 this feature was only available in Enterprise edition. Now that it’s in Standard edition, data compression is an option for far more people.

dbatools has three functions available to help you work with data compression, and in true dbatools style they make it easy and fast to compress your databases.
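If you want to confirm which compression commands your installed version ships with, you can list them straight from the module (a quick check, assuming the module is already imported):

Get-Command -Module dbatools -Name *DbaDbCompression*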

Get-DbaDbCompression

This one is pretty straightforward: it shows you your current compression levels across one or more SQL Servers. You can either view all objects or narrow the results down to a specific database, as shown below.

Get-DbaDbCompression -SqlInstance Server1 -Database AdventureWorks2017 | 
Select-Object Database, Schema, TableName, IndexName, IndexType, Partition, DataCompression

Test-DbaDbCompression

Now this is where the magic happens. This function takes the legwork out of deciding whether compression is a good fit for your database. When you look to implement data compression you have two options (as far as rowstore compression goes): row or page compression. Page compression gives you superior space savings, but it comes with more CPU overhead.

inefficient data types and repeated data

When you start to analyze your database to make this decision, you first need to look at your table structures. Do you have a lot of fixed-length data types that aren’t being fully utilized? Think bigint storing the number 1, or char(1000) storing ‘Jess’ – then row compression could be a good fit. Do you have a lot of repeating data, like State or Gender columns? Then page compression could do wonders for you.
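If you want a rough feel for where those wide fixed-length columns live, you can poke at the SMO objects dbatools returns. This is just a sketch – the type filter is illustrative and the property names come from SMO rather than anything compression-specific, so adjust to taste:

# Rough scan for fixed-length columns that may be good row compression candidates
Get-DbaDbTable -SqlInstance Server1 -Database AdventureWorks2017 |
    ForEach-Object { $_.Columns } |
    Where-Object { $_.DataType.Name -in 'char', 'nchar', 'bigint' } |
    Select-Object @{ Name = 'Table'; Expression = { $_.Parent.Name } },
        Name,
        @{ Name = 'DataType'; Expression = { $_.DataType.Name } }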

workload and I/O

Secondly, and perhaps more importantly, is your workload. As previously mentioned there is a CPU overhead associated with querying compressed data, so if you are doing a lot of seeks and/or updates the benefits might be outweighed by the costs. On the other hand if you do a lot of scans one of the benefits of data compression, more data stored per page, will greatly reduce your I/O costs and improve your performance overall.

This is a lot to think about for each object in each of your databases. Worry not, friends! The SQL Server Tiger Team created a script (available on their GitHub) that will analyze both your table structures and your workload. This makes up the logic within Test-DbaDbCompression.

compression in action

You can see below I’ve analyzed the entire AdventureWorks2017 database and saved the results to a variable. This makes it easy to work through the output, looking at certain objects/indexes of interest.

$results = Test-DbaDbCompression -SqlInstance Server1 -Database AdventureWorks2017
$results | Where-Object TableName -eq 'SalesOrderDetail' |
Select-Object TableName, IndexName, IndexId, PercentScan, PercentUpdate,
    RowEstimatePercentOriginal, PageEstimatePercentOriginal,
    CompressionTypeRecommendation, SizeCurrent, SizeRequested, PercentCompression |
Format-Table


This database is actually running in a container on my laptop, so there isn’t much activity. When you use this command, PercentScan and PercentUpdate are determined by your workload, so the longer your instance has been up, the more accurate these values will be.
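A quick way to sanity-check that is to look at the instance uptime first. Get-DbaUptime is part of dbatools; the exact property names shown here may vary between versions, so treat this as a sketch:

# How long has the instance been collecting workload stats?
Get-DbaUptime -SqlInstance Server1 |
    Select-Object SqlServer, SqlStartTime, SqlUptime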

I’ve chosen to look at the SalesOrderDetail table in the example above. You can see the function suggests we apply page compression to our primary key (IndexId of 1) and to one of our nonclustered indexes (IndexId of 3).

Set-DbaDbCompression

The final compression function is used to apply compression to our objects. You can choose to apply row or page compression to your entire database, which could be useful to save space in your development or test environments.

Set-DbaDbCompression -SqlInstance Server1 -Database AdventureWorks2017 -CompressionType Page

More useful, however, is to once again use the Tiger Team script to apply the recommended compression to your objects.

Running the following one-liner will first analyze your database using the same logic we discussed above, and then apply the suggested compression levels to each index and table within your database.

Set-DbaDbCompression -SqlInstance Server1 -Database AdventureWorks2017 -CompressionType Recommended

additional options

There are also some other options available to control this behavior. You can use the -PercentCompression parameter so that objects are only compressed if the calculated savings are greater than the specified percentage. You can also control how long the command runs with -MaxRunTime; set it to 60 and it will stop after 60 minutes, finishing whichever compression command is currently running before it exits.

Set-DbaDbCompression -SqlInstance Server1 -Database AdventureWorks2017 -CompressionType Recommended -PercentCompression 25 -MaxRunTime 60

conclusion

One final idea I’ll leave you with: earlier I ran Test-DbaDbCompression and saved the output to a variable. I did this because I like to be able to review the suggestions, and I can also keep the output for a later date if needed. Once I’m happy and ready to run Set-DbaDbCompression, I don’t want to wait for the analysis to happen again. Instead, I can pass the saved output in with the -InputObject parameter, and Set-DbaDbCompression will work through those suggestions, applying the recommended compression levels immediately.

Set-DbaDbCompression -SqlInstance Server1 -InputObject $results
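If you do want to keep those suggestions around for later review, exporting the variable to a file is straightforward (the path here is just an example):

# Save the analysis for later review; the path is illustrative
$results | Export-Csv -Path 'C:\temp\CompressionSuggestions.csv' -NoTypeInformation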

Data compression can be a powerful tool in your DBA toolbelt for saving space and tuning performance. You can apply it with minimal effort and no application code changes. I’d recommend playing with the Test-DbaDbCompression function to see if you can’t squeeze out some easy gains.

Cheers,

Jess

migrating application databases with dbatools

I’ve been working on a project this year to upgrade SQL Server versions for around 80 application databases, with most of the upgrades requiring both SQL Server and Windows Server upgrades to get to the future state we were looking for. The general process for each of these was to build a new virtual machine with the upgraded operating system, install the desired SQL Server version, and then migrate the application databases during an arranged downtime window.

I’m going to focus on the final step of this process in this post – migrating the databases during the downtime windows. Luckily for me, dbatools made this both easy and repeatable.

Step 1 – Check for connections

The first step when we get into the downtime window is to check whether there are any active connections to the database we want to migrate. We don’t want any data changing while we migrate, and there’s a command for that:

Get-DbaProcess -SqlInstance SourceServer -Database MigratingDatabase |
Select-Object Host, Login, Program

If there are connections and it’s safe to remove them (if they are still coming from the application, it might be worth talking to the app owners first), you can pipe them to another handy dbatools command:

Get-DbaProcess -SqlInstance SourceServer -Database MigratingDatabase | 
Stop-DbaProcess

Step 2 – Migrate the database

Now that there are no connections, we can move the database. Depending on the situation, it might be worth setting the database to read-only or single-user mode first, as sketched below. In my case, I had the application taken down, so I felt confident no connections would be coming in.
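If you do want that extra safety net, Set-DbaDbState covers it. A minimal sketch – pick whichever state fits your situation:

# Optional: lock the source database down before the copy
Set-DbaDbState -SqlInstance SourceServer -Database MigratingDatabase -ReadOnly -Force

# or restrict it to a single connection
Set-DbaDbState -SqlInstance SourceServer -Database MigratingDatabase -SingleUser -Force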

With one line of code we can specify the source and destination servers and the database name, state that we want to use the backup-and-restore method, and then provide the path to a file share that both instance service accounts have access to:

Copy-DbaDatabase -Source SourceServer -Destination DestinationServer -Database MigratingDatabase -BackupRestore -SharedPath \\fileshare\

There are a lot more options available on this command, including setting the number of backup files to use, which can speed things up if you have a large database. I recommend checking out the command-based help for all the available options.
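The help is always the safest reference, and striping the backup across several files is worth trying on big databases. Treat the -NumberFiles parameter below as an assumption to verify against your dbatools version:

# Review every parameter and example for the copy command
Get-Help Copy-DbaDatabase -Detailed

# Stripe the backup across multiple files to speed up a large copy
# (-NumberFiles is assumed here; confirm it in the help output above)
Copy-DbaDatabase -Source SourceServer -Destination DestinationServer -Database MigratingDatabase -BackupRestore -SharedPath \\fileshare\ -NumberFiles 8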

Step 3 – Migrate the user logins

Once the database is on the new server, we can use the following to copy the associated logins across. The nice thing about using this command is that it ensures the user SIDs match up on the destination, so you don’t end up with any orphaned SQL logins.

Copy-DbaLogin -Source SourceServer -Destination DestinationServer -Login AppReadOnly, AppReadWrite, DOMAIN\AppUser

Step 4 – Set the source database offline

Now that the database and associated logins have been migrated, we can set the source database offline. I did this so that if there were any issues getting the application up, we could quickly revert while ensuring nothing was still accessing the old copy.

Set-DbaDbState -SqlInstance SourceServer -Database MigratingDatabase -Offline -Force

In the end I was able to use five lines of PowerShell to get these application databases migrated to their new homes. After some testing I dropped the old offline copy of the database and eventually decommissioned the old servers.
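Collected in one place, the downtime-window runbook looked roughly like this (server, database, and login names are the placeholders used throughout this post):

# 1. Check for and remove any remaining connections
Get-DbaProcess -SqlInstance SourceServer -Database MigratingDatabase | Stop-DbaProcess

# 2. Copy the database via backup and restore through a shared path
Copy-DbaDatabase -Source SourceServer -Destination DestinationServer -Database MigratingDatabase -BackupRestore -SharedPath \\fileshare\

# 3. Copy the logins so the SIDs match and nothing is orphaned
Copy-DbaLogin -Source SourceServer -Destination DestinationServer -Login AppReadOnly, AppReadWrite, DOMAIN\AppUser

# 4. Take the old copy offline as a quick-revert fallback
Set-DbaDbState -SqlInstance SourceServer -Database MigratingDatabase -Offline -Force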

I hope this gives you some ideas of how dbatools can help make your database migrations easier and more efficient.

Jess 🇬🇧
