sql server migration enhancements

A while back, I added some new features to our migration commands but forgot to blog about them. Then I used one of the new features for a fast and successful migration, got so pumped that I had to share.

Multiple destinations

Now, you can migrate from one server to many. This applies to both Start-DbaMigration and all of the Copy-Dba* commands, including Copy-DbaDatabase and Copy-DbaLogin.

As you can see in the title bar of the Out-GridView output, I am migrating from workstation, which is a SQL Server 2008 instance, to localhost\sql2016 and localhost\sql2017. My entire command is as follows:

Start-DbaMigration -Source workstation -Destination localhost\sql2016, localhost\sql2017 -BackupRestore -UseLastBackup | Out-GridView

If you examine the results, you’ll see it migrated sp_configure, then moved on to credentials, first for localhost\sql2016 and then for localhost\sql2017. From there, it continued on, migrating central management server, database mail, server triggers and databases.

It migrates a bunch of other things too, of course. Haven’t seen or performed a migration before? Check out this 50-second video of a migration.

-UseLastBackup

This one is awesome. Now, you can use your existing backups to perform a migration! Such a huge time saver. Imagine the following scenario:

Dear DBA,
Your mission, should you choose to accept it, is to migrate the SQL Server instance for a small SharePoint farm.
You’ll work with the SharePoint team and will have four hours to accomplish your portion of the migration.
All combined, the databases are about 250 GB in size. Good luck, colleague!

A four-hour outage window seems reasonable to me! Here is one way I could accomplish this task:

  • Update the SQL Client Alias for the SharePoint Servers
  • Shut down the SharePoint servers
  • Execute my scheduled log backup job one final time
  • Perform the migration using the -UseLastBackup switch
  • Boot up the SharePoint Servers
  • Check that the sites work
  • Party 🎈

I’ve actually done this a number of times, and even made a video about it a couple years ago.

The video’s output is drastically different from today’s, but the overall approach still applies. Note that back then, -UseLastBackup did not yet exist, so in the video it performs a backup and restore, not just a restore. I should make a new video… one day!
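
For reference, here’s roughly what the approach in the video (before -UseLastBackup existed) looks like. This is a sketch; the \\nas\migration share is an illustrative path that both instances would need to reach:

# without -UseLastBackup, fresh backups are taken to a shared path, then restored
Start-DbaMigration -Source spcluster -Destination newcluster -BackupRestore -SharedPath \\nas\migration | Out-GridView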

So basically, this is the code I’d execute to migrate from spcluster to newcluster:

# first, modify the alias then shut down the SharePoint servers
$spservers = "spweb1", "spweb2", "spapp1", "spapp2"
Remove-DbaClientAlias -ComputerName $spservers -Alias spsql1
New-DbaClientAlias -ComputerName $spservers -ServerName newcluster -Alias spsql1
Stop-Computer -ComputerName $spservers

# Once each of the servers was shut down, I'd begin my SQL Server migration.
Start-DbaMigration -Source spcluster -Destination newcluster -BackupRestore -UseLastBackup | Out-GridView

# Mission accomplished 🕵️‍♀️🕵️‍♂️

This would migrate all of my SQL logins (with their passwords and SIDs), jobs, linked servers and all the rest, then build the chain from my last full, diff and log backups and perform the restore to newcluster! Down. To. My. Last. Log. So awesome 😁👍
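
If you want to confirm ahead of time that a complete chain exists, Get-DbaDbBackupHistory can show you. Here’s a quick sketch using its -Last switch, which returns the most recent full, diff and log backups per database:

# sanity-check the restore chain on the source before the outage window opens
Get-DbaDbBackupHistory -SqlInstance spcluster -Last |
    Select-Object Database, Type, Start, End, Path |
    Format-Table -AutoSize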

Thanks so much for that functionality, Stuart, Oleg and Simone!

What about VLDBs?

For very large database migrations, we currently offer log shipping. Sander Stad made some cool enhancements to Invoke-DbaDbLogShipping and Invoke-DbaDbLogShipRecovery recently, too.

# Also supports multiple destinations!
# Oh, and has a ton of params, so use a PowerShell splat
$params = @{
    Source = 'sql2008'
    Destination = 'sql2016', 'sql2017'
    Database = 'shipped'
    BackupNetworkPath = '\\backups\sql'
    PrimaryMonitorServer = 'sql2012'
    SecondaryMonitorServer = 'sql2012'
    BackupScheduleFrequencyType = 'Daily'
    BackupScheduleFrequencyInterval = 1
    CompressBackup = $true
    CopyScheduleFrequencyType = 'Daily'
    CopyScheduleFrequencyInterval = 1
    GenerateFullBackup = $true
    Force = $true
}

# pass the splat
Invoke-DbaDbLogShipping @params

# And now, failover to secondary
Invoke-DbaDbLogShipRecovery -SqlInstance localhost\sql2017 -Database shipped

Other migration options such as Classic Mirroring and Availability Groups are still on the agenda.

Happy migrating!

- Chrissy

real-world tde database migrations

In today’s post, I will tell you how we managed to successfully complete a migration during a ~12 hour maintenance window. It could have taken a LOT longer if we didn’t have dbatools to automate several of the steps.

Although I won’t go into every detail of our process, I want to emphasize the areas where we chose to use dbatools to make our lives easier.

the goal

Recently, we got the green light to upgrade to SQL Server 2016 and we were ready to roll. Our task was to migrate multiple servers, each with several TDE-encrypted databases. All the databases were mirrored to SQL Servers hosted in a different datacenter.

When protecting data using TDE, special care must be taken when it comes to migrations. We had two primary options for migrating TDE protected databases.

first option

One option would be to decrypt the databases on the old servers prior to the migration. This can take a while depending on database size, as the process touches every single data page on disk. The same would be true when re-encrypting the databases on the new servers.

If you ever decide to take this route during your migration, make sure you follow the correct and complete process to disable TDE. Otherwise, you can lock yourself out of your data if you don’t have the certificates and keys backed up somewhere else, ready to be restored in case of emergency. And perhaps most importantly, make sure you test this process!

second option

The second option (and the one we chose) was to leave encryption enabled. In order to attach the files or restore from the backups, you need the same certificate that was used for encryption. This certificate is protected by the master key.

To accomplish this:

  1. Make backups of the master key and the certificates
  2. Restore the key and certificates on the new principal and mirror pairs

Be aware that each database can have its own certificate! You must be sure which database is protected by which certificate. Failing to sort this out will leave you with files you cannot attach or restore anywhere. Basically, you’d lose the data 😢

Need help figuring all of this out? Check out Microsoft’s article Move a TDE Protected Database to Another SQL Server. As I mentioned before, make sure you test this process ahead of time!
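
dbatools can help with this step, too. Here’s a minimal sketch, assuming a single certificate in master; oldServer, newPrincipal, newMirror and the share are placeholders, so verify the parameters against your dbatools version:

$certPass = Read-Host -AsSecureString -Prompt 'Certificate backup password'

# check which certificate protects which database before exporting anything
Get-DbaDbEncryption -SqlInstance oldServer

# back up the TDE certificate (and its private key) from the old server
Backup-DbaDbCertificate -SqlInstance oldServer -Database master -Path \\safe\share -EncryptionPassword $certPass

# restore it to the new principal and mirror
Restore-DbaDbCertificate -SqlInstance newPrincipal -Path \\safe\share -DecryptionPassword $certPass
Restore-DbaDbCertificate -SqlInstance newMirror -Path \\safe\share -DecryptionPassword $certPass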

preparation

In preparation for the migration day, we built all the new servers (primaries and mirrors) ahead of time and configured them based on our requirements.

A key point is to restore the keys used for encryption to the new servers. From this point on you won’t have to worry too much about TDE.

Next is where dbatools comes into play:

$serverList = @(
    'server_01',
    'server_02',
    'server_03'
)

foreach ($server in $serverList) {
    Set-DbaSpConfigure -SqlInstance $server -ConfigName CostThresholdForParallelism -Value 50
    Set-DbaSpConfigure -SqlInstance $server -ConfigName DefaultBackupCompression -Value 1
    Set-DbaSpConfigure -SqlInstance $server -ConfigName OptimizeAdhocWorkloads -Value 1
    Set-DbaSpConfigure -SqlInstance $server -ConfigName RemoteDacConnectionsEnabled -Value 1
    Set-DbaSpConfigure -SqlInstance $server -ConfigName ShowAdvancedOptions -Value 1
    # Insert all your config options here

    Set-DbaPowerPlan -ComputerName $server
    Set-DbaDbOwner -SqlInstance $server

    # Suppress all successful backups in SQL server error log
    Enable-DbaTraceFlag -SqlInstance $server -TraceFlag 3226

    # Set max memory to the recommended MB
    Set-DbaMaxMemory -SqlInstance $server
}

No matter how many servers you have in your environment, doing it this way saves a lot of time, and you can be sure all of them have the same configuration.
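
You can also double-check the result across the whole list in one pass. A small sketch using Get-DbaSpConfigure:

# compare the running values across every new server at a glance
Get-DbaSpConfigure -SqlInstance $serverList | Out-GridView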

Next, we created our DBA toolkit database where we keep all the handy stuff:

# $sql holds our CREATE DATABASE script; a minimal stand-in:
$sql = "CREATE DATABASE DBA"

foreach ($server in $serverList) {
    # Create the DBA database
    Invoke-DbaQuery -SqlInstance $server -Query $sql

    # Install sp_WhoIsActive
    Install-DbaWhoIsActive -SqlInstance $server -Database DBA
}

Couldn’t be easier than this!

moving forward

Next, we created our backup jobs:

foreach ($server in $serverList) {
    # Install Ola Hallengren's maintenance solution, including the Agent jobs
    Install-DbaMaintenanceSolution -SqlInstance $server -Database DBA -ReplaceExisting -CleanupTime 72 -LogToTable -Solution "All" -BackupLocation "X:\SQLBackup" -OutputFileDirectory "X:\SQLMaintenanceLogs" -InstallJobs
}

For auditing purposes, we even saved the old server configuration and compared it to the new one, all with only a few lines of code:

# $oldprops and $newprops hold the output of Get-DbaSpConfigure
# from the old and new instances, respectively
$propcompare = foreach ($prop in $oldprops) {
    [pscustomobject]@{
        Config              = $prop.DisplayName
        'SQL Server 2008R2' = $prop.RunningValue
        'SQL Server 2016'   = ($newprops | Where-Object ConfigName -eq $prop.ConfigName).RunningValue
    }
}

Save $propcompare to a database using Write-DbaDataTable and you’re set.
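
That last step is a one-liner. In this sketch, the DBA database and the ConfigCompare table name are my own placeholders:

# -AutoCreateTable builds the destination table on first run
$propcompare | Write-DbaDataTable -SqlInstance newServer -Database DBA -Table ConfigCompare -AutoCreateTable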

Transferring the logins from the old server is now easier than ever (no more sp_help_revlogin):

Copy-DbaLogin -Source oldServer -Destination newServer

Beat that if you can! This way we ensured that the old logins would work as soon as we started the applications.

We started to build the mirrors ahead of time (we have some big databases). We got the details for the last full backups on the old servers:

$OldServerList | ForEach-Object {
    Get-DbaDbBackupHistory -SqlInstance $_ -LastFull | Select-Object SqlInstance, Database, Start, End, Duration, Path, TotalSize
} | Format-Table -AutoSize

Based on that output, we fired up a quick PowerShell script that copied everything over to a network share. From there, restoring the backups to the new mirrors was as simple as:

Restore-DbaDatabase -SqlInstance $newSQLServer_01 -Database db_1 -Path \\SharedPath\Migration\db_1 -NoRecovery -WithReplace -Verbose
Restore-DbaDatabase -SqlInstance $newSQLServer_02 -Database db_2 -Path \\SharedPath\Migration\db_2 -NoRecovery -WithReplace -Verbose
...
Restore-DbaDatabase -SqlInstance $newSQLServer_0N -Database db_N -Path \\SharedPath\Migration\db_N -NoRecovery -WithReplace -Verbose

I can do this all day long, especially when piping the output of Backup-DbaDatabase straight into it 🙂
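
In case you haven’t seen that pattern, here’s a sketch with illustrative instance and database names; -TrustDbBackupHistory tells the restore to accept the piped backup history as-is:

# back up on the old server and restore to the new mirror in one pipeline
Backup-DbaDatabase -SqlInstance oldServer -Database db_1 -Path \\SharedPath\Migration |
    Restore-DbaDatabase -SqlInstance newSQLServer_01 -TrustDbBackupHistory -NoRecovery -WithReplace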

finalizing the migration

During the maintenance window, we just took a last DIFF for each database and restored it to the new mirrors. Some manual file growth on a few of the databases and the restore of the DIFFs took us the longest (the DIFFs contained several days of data for each server).

To shorten the downtime for the principals, we used the detach/attach approach for the data and log drives, which went pretty smoothly with no unexpected incidents.

And, in case you’re wondering, we do have valid backups. We even restore and test them automatically on a separate server, using dbatools as well. These jobs take around 17 hours per day.
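
If you want to set up that kind of automated restore testing yourself, take a look at Test-DbaLastBackup. A minimal sketch, with placeholder server names:

# restores the latest backups to a scratch server, runs DBCC CHECKDB, then drops the test copies
Test-DbaLastBackup -SqlInstance prod01 -Destination resttest01 | Out-GridView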

I must confess, we did a bit of T-SQL to bring the mirroring up and now we’re back in business, HA included.

post migration

Now to the post migration stuff.

Once we’re running on the new instances, we made sure we enable all the jobs we already created ahead of time:

$serverList = @(
    'server_01',
    'server_02',
    'server_03'
)

foreach ($server in $serverList) {
    $jobs = Get-DbaAgentJob -SqlInstance $server
    $jobs | ForEach-Object {Set-DbaAgentJob -SqlInstance $server -Job $_ -Enabled}
} 

Alternatively, piping support was added to today’s release, 0.9.191. With this release and above, you can use the following syntax:

# much less code 👍
$serverlist | Get-DbaAgentJob | Set-DbaAgentJob -Enabled

We decided we needed more RAM in a few of the servers, and after adding it, we just executed the following to make SQL Server aware of the change:

Set-DbaMaxMemory -SqlInstance $server

Now, the moment of truth. The applications started and, what do you know, everything just worked!

  • No orphaned logins
  • No SID mismatches for the logins
  • No error messages

<insert happy tears here>

automation is awesome

As you can see, a lot of steps were automated using dbatools and this saved us a lot of time overall, making this multi-terabyte database migration across 5 environments very smooth.

There is still room for a lot more automation, for even more efficiency. I’m sure we’ll do better next time.

Thanks again to all of you contributing to this project and Chrissy, thanks for starting all this. Can’t wait to see what the future will bring for dbatools and I hope I’ll be able to contribute more.

Happy migrations everyone!

- Viorel 🇧🇪

scheduling a migration

Hey, Chrissy here. Recently, I had great success with scheduling a database migration and wanted to let you know how I did it in case you have a similar requirement.

My requirement was to copy two databases, 30 GB in size, from one server to another, during a time when I wouldn’t be at the office. Remote work was not possible.

The Script

Here’s the scheduled-migration.ps1 script that ultimately worked for me.

Start-Transcript C:\logs\db-migration-9-1-2016.txt
Import-Module C:\scripts\dbatools\dbatools.psd1
Copy-DbaDatabase -Source sql01 -Destination sql02 -Database WSS_Content, WSS_Content2 -BackupRestore -SharedPath \\nas\sql\migration
Stop-Transcript

What it does

First, I did not need to migrate an entire instance, so I did not use Start-DbaMigration. Instead, I used Copy-DbaDatabase, which doesn’t transcribe automatically the way Start-DbaMigration does. Because of this, I explicitly requested a transcript.

Also note that I had already copied over all of the logins, jobs, and other objects that I needed. If you have any dependencies, make sure those are copied over as well.

Next, I imported the module to ensure that the Copy-DbaDatabase command would be available to the Scheduled Task. Potentially, I could put this in my path and hope for the best, but I’d rather be explicit.

Finally, I stopped the transcript.

Testing and -WhatIf

Because Copy-DbaDatabase runs so many checks and supports -WhatIf, I then tested by doing the following:

  1. Added -WhatIf to the Copy-DbaDatabase command
  2. Ran scheduled-migration.ps1 from the command line, and it worked
  3. Set up a scheduled task to run once, that night at 9pm, ensuring that the migration occurred whether or not I was logged in (a scripted version of this step is sketched after the list)
  4. Then I executed the scheduled task manually, waited until it finished, then checked the transcript
  5. All the tests passed!
  6. I then removed the -WhatIf from Copy-DbaDatabase inside scheduled-migration.ps1 and went home for the night 🙂
  7. The next morning, I excitedly checked my transcript and boom, both databases had migrated as planned!
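
For the curious, here’s roughly what that scheduled-task step looks like in script form. A sketch using the ScheduledTasks module; the paths, account name and time are illustrative:

# run the migration script once at 9pm, whether or not anyone is logged in
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-File C:\scripts\scheduled-migration.ps1'
$trigger = New-ScheduledTaskTrigger -Once -At '21:00'
# -User and -Password let the task run while logged off; use a service account rather than your own
Register-ScheduledTask -TaskName 'db-migration' -Action $action -Trigger $trigger -User 'AD\svc-sql' -Password (Read-Host 'Task account password')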

In conclusion…

You don’t need to be at the office to migrate your data. So schedule that migration. Go home to your family, your cats, and/or your video game console. Or drink a craft beer and watch some Rick and Morty. You deserve it.

another batch of new commands now available!

dbatools is not only intended to be a great migration tool, but also a toolset to help DBAs follow best practices. Check out all the new commands in this batch, courtesy of Mike Fal, Constantine Kokkinos and Chrissy LeMaire.

Best Practices Commands

Ever read a really great article about how to do something properly but then kinda forgot and you have to revisit each time? Well, we’ve codified some of those practices for you, making them easy to both follow and remember.

Commands that make your life easier

Commands that are coming soon

And we’ve got more planned!

  • Copy-SqlMaintenancePlan
    Maintenance plan support isn’t provided by dbatools at this time, but that’s soon going to change once this command is complete.

  • Disable-DbaLogonTrigger
    Like Reset-DbaAdmin, this is a command that won’t be used often, but when it is, it’s a lifesaver.

  • Find-DbaSqlInstance
    This one’s gonna be fun. Scan your subnet, AD or specific servers for SQL Server instances.

  • Move-DbaDatabaseFile
    We wanted to ensure this command is as fail-proof and useful as possible. It’s been tough getting progress bars to work, but we’re getting there! This command should be available in our next batch.

  • Restore-DbaBackupFromDirectory
    Routine to restore databases from directories (think the way Ola Hallengren’s solution writes its backups by default)

  • Remove-DbaBackupFromDisk
    Routine to remove SQL backups from disk. If you copy your backups to tape or use a third-party solution, this command will ensure that no backups are deleted until they’ve been marked as archived.

  • Test-DbaBackup
    Routine to test your backups

  • Write-SqlSpWhoIsActive
    Write the results of Show-SqlSpWhoIsActive to a table!

Join us!

Some of these commands are in their infancy. Want to help make them better? Come join the coding party! We’re all hanging out on the SQL Server Community Slack in the #dbatools channel.

new best practices commands now available

So many SQL Server and PowerShell pros have joined the dbatools team and we’re producing well-designed PowerShell commands like mad!

dbatools is not only intended to be a great migration tool, but also a toolset to help DBAs follow best practices. We like to think of these commands as fully automated Wizards that are executed from the command line instead of the GUI.

Best Practices Commands

Ever read a really great article about how to do something properly but then kinda forgot and you have to revisit each time? Well, we’ve codified some of those practices for you, making them easy to both follow and remember.

  • Remove-DbaDatabaseSafely
    This command takes a database safely out of production by performing the following steps:

    1. Performs a DBCC CHECKDB
    2. Backs up the database WITH CHECKSUM
    3. Restores with VERIFY ONLY on the source
    4. Creates an Agent job to easily restore from that backup
    5. Drops the database
    6. The Agent job restores the database
    7. Performs a DBCC CHECKDB and drops the database for a final time

    Simply amazing! You can read more about Remove-DbaDatabaseSafely on Rob's blog.

  • Set-DbaTempdbConfig
    This command, created by Mike Fal, sets tempdb data and log files according to best-practice calculations. Mike writes more about this function on his blog.

  • Find-DbaDbUnusedIndex
    This command, created by Aaron Nelson, will help you to find unused indexes on a database or a list of databases. It also tells how much space you can save by dropping the index. For now only supported for CLUSTERED and NONCLUSTERED indexes.

Commands that make your life easier

  • Invoke-DbaWhoIsActive
    We wrote a PowerShell command to output results of Adam Machanic’s beloved sp_WhoIsActive to a GridView (default) or DataTable. If sp_WhoIsActive is not installed in the system, it will be downloaded and installed to a database you specify with either -Database or a database you select from Show-DbaDbList.

    This is v0.1 of Invoke-DbaWhoIsActive. We’ll be working on formatting options and auto-population soon.

  • Repair-DbaDbOrphanUser
    This command, created by Cláudio Silva, helps DBAs find and repair orphaned users in one, multiple or all databases.

  • Remove-DbaDbOrphanUser
    Another home run by Cláudio Silva; this one removes orphaned database users with no matching logins.

  • Get-DbaDiskSpace
    Displays Disk information for all local drives on one or more servers. Returns server name, name of disk, label of disk, total size, free size and percent free. Considering a new column in the future that designates if any SQL data/log files are stored on the returned disks. What do you think?

  • Get-DbaRegServer
    Gets list of SQL Server names stored in SQL Server Central Management Server. Filtering by one or more server groups supported.

  • Show-SqlServerFileSystem
    Similar to the remote file system popup you see when browsing a remote SQL Server in SQL Server Management Studio, this command allows you to traverse the remote SQL Server’s file structure.

    Show-SqlServerFileSystem uses SQL Management Objects to browse the directories and what you see is limited to the permissions of the account running the command. This will complement the upcoming Move-SqlDatabaseFile command.

  • Show-DbaDbList
    Shows a list of databases in a GUI. Returns a simple string. Hitting cancel returns null.

Test Commands

  • Test-DbaTempdbConfig
    Tests your tempdb configuration.

  • Test-DbaNetworkLatency
    This function is intended to help measure SQL Server network latency by establishing a connection and running a simple query. It’s a better alternative to ping because it actually creates a connection to the SQL Server, and it times not only the entire routine, but also how long the actual queries take versus how long it takes to get the results.

  • Test-DbaPath
    Use this command to determine whether the SQL Server service account can “see” a directory. This command uses master.dbo.xp_fileexist under the hood and returns $true or $false (see the quick example after this list).

  • Test-DbaMigrationConstraint
    Shows whether you can migrate the database(s) between the servers. When you want to migrate from a higher edition to a lower one, there are some features that can’t be used. This function validates whether any of those features are in use and reports back to you. The validation is made only on SQL Server 2008 or higher, using the ‘sys.dm_db_persisted_sku_features’ DMV.
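
For instance, here’s a quick Test-DbaPath check (sql01 and the share name are placeholders):

# can the instance's service account see the backup share?
Test-DbaPath -SqlInstance sql01 -Path \\nas\sql\migration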

Join us!

Some of these commands are in their infancy. Want to help make them better? Come join the coding party! We’re all hanging out on the SQL Server Community Slack in the #dbatools channel.
