dbatools 1.0 has arrived πŸŽ‰

We are so super excited to announce that after 5 long years, dbatools 1.0 is publicly available!

Our team had some lofty goals and met the vast majority of them πŸ†. In the end, my personal goal for dbatools 1.0 was to have a tool that is not only useful and fun to use, but trusted and stable as well. Mission accomplished: over the years, hundreds of thousands of people have used dbatools, and it's even recommended by Microsoft.

Before we get started with what’s new, let’s take a look at some history.

historical milestones

dbatools began in July of 2014 when I was tasked with migrating a SQL Server instance that supported SharePoint. No way did I want to do that by hand! Since then, the module has grown into a full-fledged data platform solution.

  • 07/2014 – Started
  • 07/2014 – Published to GitHub & ScriptCenter
  • 06/2016 – First major contributors
  • 01/2017 – Road to 1.0 began
  • 03/2018 – Switch from GPL to MIT
  • 05/2019 – Added MFA Support
  • 06/2019 – Over 160 contributors and 550 commands

Thanks so much to every single person who has volunteered any time to dbatools. You’ve helped change the SQL Server landscape.

improvements

We’ve made a ton of enhancements over the past six months that we haven’t even had time to share. Here are a few.

availability groups

Availability Group support has been solidified, and New-DbaAvailabilityGroup is better than ever. Try out the changes and let us know how you like them.

Get-Help New-DbaAvailabilityGroup -Examples

authentication support

We now also support all the different ways to log in to SQL Server!

Want to try it for yourself? Here are a few examples.

# AAD Integrated Auth
Connect-DbaInstance -SqlInstance psdbatools.database.windows.net -Database dbatools

# AAD Username and Pass
Connect-DbaInstance -SqlInstance psdbatools.database.windows.net -SqlCredential [email protected] -Database dbatools

# Managed Identity in Azure VM w/ older versions of .NET
Connect-DbaInstance -SqlInstance psdbatools.database.windows.net -Database abc -SqlCredential appid -Tenant tenantguidorname

# Managed Identity in Azure VM w/ newer versions of .NET (way faster!)
Connect-DbaInstance -SqlInstance psdbatools.database.windows.net -Database abc -AuthenticationType 'AD Universal with MFA Support'

You can also find a couple more within the MFA Pull Request on GitHub and by using Get-Help Connect-DbaInstance -Examples.

registered servers

This is probably my favorite! We now support Local Server Groups and Azure Data Studio groups. Supporting Local Server Groups means that it’s now a whole lot easier to manage servers that don’t use Windows Authentication.

Here’s how you can add a local docker instance.

# First add it with your credentials
Connect-DbaInstance -SqlInstance 'dockersql1,14333' -SqlCredential sqladmin | Add-DbaRegServer -Name mydocker

# Then just use it for all of your other commands.
Get-DbaRegisteredServer -Name mydocker | Get-DbaDatabase

Totally dreamy 😍

csv

Import-DbaCsv is now far more reliable. While the previous implementation was faster, it didn’t work a lot of the time. The new command should suit your needs well.

Get-ChildItem C:\allmycsvs | Import-DbaCsv -SqlInstance sql2017 -Database tempdb -AutoCreateTable

future & backwards compatible

In the past couple months, we’ve started focusing a bit more on Azure: both Azure SQL Database and Managed Instances. In particular, we now support migrations to Azure Managed Instances! We’ve also added a couple more commands to PowerShell Core, in particular the Masking and Data Generation commands. Over 75% of our commands run on macOS and Linux!

Still, we support PowerShell 3, Windows 7 and SQL Server 2000 when we can. Our final testing routines included ensuring support for:

  • Windows 7
  • SQL Server 2000-2019
  • User imports vs Developer imports
  • macOS / Linux
  • x86 and x64
  • Strict (AllSigned) Execution Policy

new commands

We’ve also added a bunch of new commands, mostly revolving around Roles, PII, Masking, Data Generation and even ADS notebooks!

Want to see the full list? Check out our freshly updated Command Index page 🀩.

configuration enhancements

A few configuration enhancements have been made, and a blog post about our configuration system is long overdue. One of the most useful enhancements, I think, is that you can now control the client name. This is the name that shows up in logs, in Profiler and in XEvents.

# Set it
Set-DbatoolsConfig -FullName sql.connection.clientname -Value "my custom module built on top of dbatools" -Register

# Double check it
Get-DbatoolsConfig -FullName sql.connection.clientname | Select Value, Description

The -Register parameter is basically a shortcut for piping to Register-DbatoolsConfig. This writes the value to the registry; otherwise, it’ll be effective only for your current session.
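If you prefer being explicit, the same thing can be done with a pipe – a minimal sketch, assuming -PassThru behaves as it does in current builds:

# set the value for this session, then persist it to the registry
Set-DbatoolsConfig -FullName sql.connection.clientname -Value "my custom module built on top of dbatools" -PassThru | Register-DbatoolsConfig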

Another configuration enhancement helps with standardization. Now, all export commands will default to Documents\DbatoolsExport. You can change it by issuing the following commands.

# Set it
Set-DbatoolsConfig -FullName path.dbatoolsexport -Value "C:\temp\exports" -Register

# Double check it
Get-DbatoolsConfig -FullName path.dbatoolsexport | Select Value, Description

help is separated

Something new that I like because it’s “proper” PowerShell: we’re now publishing our module with Help separated into its own file. We’re using a super cool module called HelpOut. HelpOut was created for dbatools by a former member of the PowerShell team, James Brundage.

HelpOut allows our developers to keep writing Help within the functions themselves, then separates the Help into dbatools-help.xml and the commands into allcommands.ps1, which helps with faster loading. Here’s how we do it:

Install-Maml -FunctionRoot functions, internal\functions -Module dbatools -Compact -NoVersion

It’s as simple as that! This does all of the heavy lifting: making the maml file and placing it in the proper location, and parsing the functions for allcommands.ps1!

Help will continue to be published to docs.dbatools.io and updated with each release. You can read more about HelpOut on GitHub.

breaking changes

We’ve got a number of breaking changes included in 1.0.

Before diving into this section, I want to emphasize that we have a command to handle a large majority of the renames! Invoke-DbatoolsRenameHelper will parse your scripts and replace script names and some parameters for you.
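For example, to update every script in a directory:

# parse each script and rewrite old command and parameter names in place
Get-ChildItem *.ps1 -Recurse | Invoke-DbatoolsRenameHelper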

command renames

Renames in the past 30 days were mostly changing Instance to Server. But we also made some command names more accurate:

Test-DbaDbVirtualLogFile -> Measure-DbaDbVirtualLogFile
Uninstall-DbaWatchUpdate -> Uninstall-DbatoolsWatchUpdate
Watch-DbaUpdate -> Watch-DbatoolsUpdate

command removal

Export-DbaAvailabilityGroup has been removed entirely. The same functionality can now be found using Get-DbaAvailabilityGroup | Export-DbaScript.

alias removals

All but 5 command aliases have been removed. Here are the ones that are still around:

Get-DbaRegisteredServer -> Get-DbaRegServer
Attach-DbaDatabase -> Mount-DbaDatabase
Detach-DbaDatabase -> Dismount-DbaDatabase
Start-SqlMigration -> Start-DbaMigration
Write-DbaDataTable -> Write-DbaDbTableData

I kept Start-SqlMigration because that’s where it all started, and the rest are easier to remember.

Also, all ServerInstance and SqlServer aliases have been removed. You must now use SqlInstance. For a full list of what Invoke-DbatoolsRenameHelper renames/replaces, check out the source code.

parameter standardization

Most of the commands now follow practices we’ve observed in Microsoft’s PowerShell modules.

  • Piped input is -InputObject and not DatabaseCollection or LoginCollection, etc.
  • Directory (and some file) paths are now -Path and not BackupLocation or FileLocation
  • When a distinction is required, file paths are now -FilePath, and not RemoteFile or BackupFileName
  • If both a file and a directory path need to be distinguished, Path is used for the directory and FilePath for the file location, as illustrated below
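Here's a hedged illustration of the convention using Export-DbaLogin (a sketch – confirm the parameters with Get-Help on your version):

# -Path takes a directory; the command picks the file name
Export-DbaLogin -SqlInstance sql2017 -Path C:\temp\exports

# -FilePath takes the exact file to write
Export-DbaLogin -SqlInstance sql2017 -FilePath C:\temp\exports\logins.sql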

parameter removal

-SyncOnly is no longer an option in Copy-DbaLogin. Please use Sync-DbaLoginPermission instead.

-CheckForSql is no longer an option in Get-DbaDiskSpace. Perhaps the functionality can be made into a new command which can be piped into Get-DbaDiskSpace but the implementation we had was πŸ‘Ž.

For a full list of breaking changes, you can browse our gorgeous changelog, maintained by Andy Levy.

book party!

In case you did not hear the news, Rob Sewell and I are currently in the process of writing dbatools in a Month of Lunches! We’re really excited and hope to have a MEAP (Manning Early Access Program) available sometime in July. We will keep everyone updated here and on Twitter.

The Manning editor looks a lot like markdown!

If you’d like to see what the writing process is like, I did a livestream a couple of months back while writing Chapter 6, which is about Find-DbaInstance. Sorry about the music being a bit loud; that has been fixed in later streams, which can be found at youtube.com/dbatools.

sponsorship

Since Microsoft acquired GitHub, they’ve been rolling out some really incredible features. One such feature is Developer Sponsorships, which allows you to sponsor developers with cash subscriptions. It’s sorta like Patreon, where you can pay monthly sponsorships at different tiers. If you or your company has benefitted from dbatools, consider sponsoring one or more of our developers.

Currently, GitHub has approved four of our team members for sponsorship: me, Shawn Melton, Stuart Moore and Sander Stad.

We’ve invited other dbatools developers to sign up as well πŸ€—

Oh, and for the first year, GitHub will match sponsorship funds! So giving to us now is like giving double.

big ol thanks

I’d like to give an extra special thanks to the contributors who helped get dbatools across the finish line these past couple months: Simone Bizzotto, Joshua Corrick, Patrick Flynn, Sander Stad, ClΓ‘udio Silva, Shawn Melton, Garry Bargsley, Andy Levy, George Palacios, Friedrich Weinmann, Jess Pomfret, Gareth N, Ben Miller, Shawn Tunney, Stuart Moore, Mike Petrak, Bob Pusateri, Brian Scholer, John G “Shoe” Hohengarten, Kirill Kravtsov, James Brundage, HΓΌseyin Demir, Gianluca Sartori and Rob Sewell.

Without you all, 1.0 would be delayed for another 5 years.

blog party!

Want to know more about dbatools? Check out some of these posts ☺

dbatools 1.0 – the tools to break down the barriers – Shane O’Neill
dbatools 1.0 is here and why you should care – Ben Miller
dbatools 1.0 and beyond – Joshua Corrick
dbatools 1.0 – Dusty R
Your DBA Toolbox Just Got a Refresh – dbatools v1.0 is Officially Available!!! – Garry Bargsley
dbatools v1.0? It’s available – Check it out!
updating sql server instances using dbatools 1.0 – Gareth N

livestreaming

We’re premiering dbatools 1.0 at DataGrillen in Lingen, Germany today and will be livestreaming on Twitch.

Thank you, everyone, for your support along the way. We all hope you enjoy dbatools 1.0.

πŸ’Œ,
Chrissy

dbatools and docker
Today’s article is part of T-SQL Tuesday. T-SQL Tuesday is the brainchild of Adam Machanic. It is a blog party on the second Tuesday of each month. Everyone is welcome to participate.

This month’s T-SQL Tuesday, hosted by dbatools Major Contributor Garry Bargsley, is all about automation.

Docker Hub

In his invitation, Garry asks “what automation are you proud of completing?” My answer is that I finally created a couple dbatools images and made them available on Docker Hub.

Docker Hub is a cloud-based repository in which Docker users and partners create, test, store and distribute container images.

I’ve long wanted to do this to help dbatools users easily create a non-production environment to test commands and safely explore our toolset. I finally made it a priority because I needed to ensure some Availability Group commands I was creating worked on Docker, too, and having some clean images permanently available was required. Also, in general, Docker is just a good thing to know for both automation and career opportunities 😁

Getting started

First, install Docker.

Then grab two images from the dbatools repo. Note that these are Linux images.

# get the base images
docker pull dbatools/sqlinstance
docker pull dbatools/sqlinstance2

The first image will take a bit to download, but the second one will be faster because it’s based on the first! Neat.

The first instance is stacked with a bunch of objects, and the second one has a few basics to enable Availability Groups. Both dbatools images are based on Microsoft’s SQL Server 2017 docker image.

I also added the following to make test migrations more interesting and Availability Groups possible:

  • Databases
  • Logins
  • Jobs
  • Endpoints
  • Server Roles
  • And more

The objects will look nice and familiar in SSMS. You may also notice that sa is disabled: within the image, I disabled the sa account and created another account with sysadmin called sqladmin. The password, as noted below, is dbatools.IO

Creating containers

To set up the containers, just copy and paste the commands below. The first one sets up a shared network, and the second one sets up the SQL Servers and exposes the required database engine and endpoint ports. It also names the containers dockersql1 and dockersql2 and gives each a hostname with the same name. I left in “docker” so that it doesn’t conflict with any potential servers named sql1 on the network.

# create a shared network
docker network create localnet

# setup two containers and expose ports
docker run -p 1433:1433 -p 5022:5022 --network localnet --hostname dockersql1 --name dockersql1 -d dbatools/sqlinstance
docker run -p 14333:1433 -p 5023:5023  --network localnet --hostname dockersql2 --name dockersql2 -d dbatools/sqlinstance2

Generally, you don’t have to map the ports to exactly what they are running locally, but Availability Groups do a bit of port detection that requires one-to-one mapping.

By the way, if you sometimes prefer a GUI like I do, check out Kitematic. It’s not ultra-useful but it’ll do.

Time to play πŸŽ‰

Now we are set up to test commands against your two containers! You can log in via SQL Server Management Studio or Azure Data Studio if you’d like to take a look first. The server name is localhost (or localhost,14333 for the second instance), the username is sqladmin and the password is dbatools.IO
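You can do the same from dbatools itself – a quick sketch using the credentials baked into the images:

# the password is dbatools.IO
$cred = Get-Credential -UserName sqladmin

# connect to each container
Connect-DbaInstance -SqlInstance localhost -SqlCredential $cred
Connect-DbaInstance -SqlInstance 'localhost,14333' -SqlCredential $cred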

Note that Windows-based commands (and commands relating to SQL Configuration Manager) will not work because the image is based on SQL Server for Linux. If you’d like to test Windows-based commands such as Get-DbaDiskSpace, consider testing them on localhost if you’re running Windows.

Set up an Availability Group

Next, we’ll set up a sample availability group. Note that since it’s referring to “localhost”, you’ll want to execute this on the computer running Docker. If you’d like to run Docker on one machine and execute the code on another machine, that is possible but out of scope for this post.

# the password is dbatools.IO
$cred = Get-Credential -UserName sqladmin

# setup a powershell splat
$params = @{
    Primary = "localhost"
    PrimarySqlCredential = $cred
    Secondary = "localhost:14333"
    SecondarySqlCredential = $cred
    Name = "test-ag"
    Database = "pubs"
    ClusterType = "None"
    SeedingMode = "Automatic"
    FailoverMode = "Manual"
    Confirm = $false
 }

# execute the command
 New-DbaAvailabilityGroup @params

Here’s the result, in both the PowerShell output and SQL Server Management Studio:

Beautiful 😍!

Performing an export

Again, from the machine running the Docker containers, run the code below. You may note that linked servers, credentials and central management server are excluded from the export. This is because they aren’t currently supported for various Windows-centric reasons.

# the password is dbatools.IO
$cred = Get-Credential -UserName sqladmin

# First, backup the databases because backup/restore t-sql is what's exported
Backup-DbaDatabase -SqlInstance localhost:1433 -SqlCredential $cred -BackupDirectory /tmp

# Next, perform export (not currently supported on Core)
Export-DbaInstance -SqlInstance localhost:1433 -SqlCredential $cred -Exclude LinkedServers, Credentials, CentralManagementServer, BackupDevices

Whaaaat! Now imagine doing this for all of your servers in your entire estate. Want to know more? Check out simplifying disaster recovery using dbatools which covers this topic in-depth.

Performing a migration

This command requires a shared directory. Check out Shared Drives and Configuring Docker for Windows Volumes for more information. You may notice that this command does not support linked servers, credentials, central management server or backup devices.

# the password is dbatools.IO
$cred = Get-Credential -UserName sqladmin

# perform the migration from one container to another
Start-DbaMigration -Source localhost:1433 -Destination localhost:14333 -SourceSqlCredential $cred -DestinationSqlCredential $cred -BackupRestore -SharedPath /sharedpath -Exclude LinkedServers, Credentials, CentralManagementServer, BackupDevices -Force

Cleaning up

To stop and remove a container (and start over if you’d like – I do this tons of times per day), run the following commands or use Kitematic’s GUI. This does not delete the actual images, just their resulting containers.

docker stop dockersql1 dockersql2
docker rm dockersql1 dockersql2

Resources

If you’d like to know more, the posts below are fantastic resources.

If you’d like to understand how containers work with the CI/CD process, check out this video by Eric Kang, Senior Product Manager for SQL Server.

Thanks for reading! Sorry about any typos or mistakes – I hastily wrote this while traveling back from vacation; I had to make Garry’s T-SQL Tuesday!

- Chrissy

dbatools extension for visual studio code

We recently released a VS Code extension that lets you highlight terms and search dbatools.io, Microsoft Docs, Google, StackOverflow, DuckDuckGo, Technet or Thwack right from your code! It’s called search from code and you can find it in the Extension Marketplace.

options and settings

By default, only Google, docs and dbatools are enabled, but you can configure whichever providers you’d like in VS Code Settings.

At first, I was just messing around to see what it took to create a VS Code extension, but then I realized that I was actually using it and decided to share. If you use VS Code, give it a shot and let me know if you’d like any features or additional search providers!

- Chrissy

P.S. Can’t see the gif? Watch the video on YouTube.

more 1.0 progress

We’ve made even more progress in the past week! Here are some highlights of 0.9.520.

Non-breaking changes

Aliases have been added for the changes, so these are not breaking changes:

  • Mismatched Copy commands have been renamed to match their corresponding Get command names (i.e. Copy-DbaCentralManagementServer is now Copy-DbaCmsRegServer).
  • Most parameters named Password have been changed to SecurePassword. They’ve always been a SecureString data type but this makes that clear.
  • The parameters ExcludeAllSystemDb and ExcludeAllUserDb have been changed to ExcludeSystem and ExcludeUser, respectively.

Reset-DbaAdmin

Reset-DbaAdmin actually has output now! This was a big oversight for an otherwise incredibly useful and cool command.

I also added a -SqlCredential parameter to make passing secure passwords easier while still staying secure.

Connect-DbaInstance

The Credential parameter in Connect-DbaInstance has been changed to SqlCredential and an alias to Credential has been added. Also, we fixed trusted domain support in both our internal and external Connect commands. So if you’ve ever had a problem with that before, it should work now.

Breaking changes

Aliases have not been created for commands using these parameters, so these are breaking changes.

  • Parameters and output columns containing MB have been changed to the parameter or column name without MB. For instance SizeMB -> Size. Corresponding documentation and examples have been updated as well.
  • Parameters such as NoSystemLogins have been changed to ExcludeSystemLogins. The basic rule I followed when determining what would change was to keep No for verbs and Exclude for nouns: NoVerb, ExcludeNoun.

Invoke-DbatoolsRenameHelper has been updated to handle the No to Exclude changes, so don’t forget you can auto-update your scripts. Running it is as simple as:

Get-ChildItem *.ps1 -Recurse | Invoke-DbatoolsRenameHelper

Note that it does not work for the breaking changes detailed below (they are massive) or any of the Exclude parameters in Start-DbaMigration, since the change wouldn’t be one-to-one and I’m not good with regex.

Get-DbaProductKey

This command has been rewritten pretty much from the ground up. It no longer requires remote registry access, just SQL WMI and PowerShell Remoting. It also accepts servers from CMS natively instead of relying on the terribly named -CmsServer parameter. To find out more, check out Get-DbaProductKey.

Invoke-DbaDbShrink

Invoke-DbaDbShrink now has better column names.

Import-DbaCsv

I’m so happy to announce that Import-DbaCsvToSql has been renamed to Import-DbaCsv and is now usable and reliable for most CSVs. The original command was ultra fast, but it came at the price of reliability. I found myself never using a fast command because it so rarely handled my imperfect data. That’s been fixed and the command is still pretty darn fast.

Writing more in-depth about this command is on my todo list, but I love that you can pipe in a bunch of CSVs from a directory and smash them into a database. Everything is done within a transaction, too, so if the import fails, the transaction is rolled back and no changes persist.

No more breaking changes for a while

K that should do it for breaking changes, at least until we’re closer to 1.0.

We’re trying our best to ensure you don’t get too frustrated while we make necessary changes, and we thank you for your patience.

Big ol’ bug bash πŸ›

The bug bash is going incredibly well! We’re now down from 90+ open bugs to just 25 minor bugs 😍 Huge thanks to everyone who has helped bring our count down to such a manageable number.

- Chrissy

sql server migration enhancements

A while back, I added some new features to our migration commands but I forgot to blog about them. Then, I used one of the new features for a fast and successful migration, got so pumped and had to share.

Multiple destinations

Now, you can migrate from one server to many. This applies to both Start-DbaMigration and all of the Copy-Dba* commands, including Copy-DbaDatabase and Copy-DbaLogin.

In this example (shown in an Out-GridView), I am migrating from workstation, which is a SQL Server 2008 instance, to localhost\sql2016 and localhost\sql2017. My entire command is as follows:

Start-DbaMigration -Source workstation -Destination localhost\sql2016, localhost\sql2017 -BackupRestore -UseLastBackup | Out-GridView

If you examine the results, you’ll see it migrated sp_configure, then moved on to credentials – first for localhost\sql2016 then localhost\sql2017. And continued on from there, migrating central management server, database mail, server triggers and databases.

It migrates a bunch of other things too, of course. Haven’t seen or performed a migration before? Check out this 50-second video of a migration.

-UseLastBackup

This one is awesome. Now, you can use your existing backups to perform a migration! Such a huge time saver. Imagine the following scenario:

Dear DBA,
Your mission, should you choose to accept it, is to migrate the SQL Server instance for a small SharePoint farm.
You’ll work with the SharePoint team and will have four hours to accomplish your portion of the migration.
All combined, the databases are about 250 GB in size. Good luck, colleague!

A four-hour outage window seems reasonable to me! Here is one way I could accomplish this task:

  • Update the SQL Client Alias for the SharePoint Servers
  • Shut down the SharePoint servers
  • Execute my scheduled log backup job one final time
  • Perform the migration using -UseLastBackup switch
  • Boot up the SharePoint Servers
  • Check that the sites work
  • Party 🎈

I’ve actually done this a number of times, and even made a video about it a couple years ago.

The outdated output is drastically different from today’s output, but the overall approach still applies. Note that back then, -UseLastBackup did not yet exist, so in the video, it performs a backup and restore, not just a restore. I should make a new video… one day!

So basically, this is the code I’d execute to migrate from spcluster to newcluster:

# first, modify the alias then shut down the SharePoint servers
$spservers = "spweb1", "spweb2", "spapp1", "spapp2"
Remove-DbaClientAlias -ComputerName $spservers -Alias spsql1
New-DbaClientAlias -ComputerName $spservers -ServerName newcluster -Alias spsql1
Stop-Computer -ComputerName $spservers

# Once each of the servers were shut down, I'd begin my SQL Server migration.
Start-DbaMigration -Source spcluster -Destination newcluster -BackupRestore -UseLastBackup | Out-GridView

# Mission accomplished πŸ•΅οΈβ€β™€οΈπŸ•΅οΈβ€β™‚οΈ

This would migrate all of my SQL logins with their passwords & SIDs, jobs, linked servers, all that – then it’d build my last full, diff and log backup chain and perform the restore to newcluster! Down. To. My. Last. Log. So awesome πŸ˜πŸ‘

Thanks so much for that functionality Stuart, Oleg and Simone!

What about VLDBs?

For very large database migrations, we currently offer log shipping. Sander Stad made some cool enhancements to Invoke-DbaDbLogShipping and Invoke-DbaDbLogShipRecovery recently, too.

# Also supports multiple destinations!
# Oh, and has a ton of params, so use a PowerShell splat
 $params = @{
    Source = 'sql2008'
    Destination = 'sql2016', 'sql2017'
    Database = 'shipped'
    BackupNetworkPath= '\\backups\sql'
    PrimaryMonitorServer = 'sql2012'
    SecondaryMonitorServer = 'sql2012'
    BackupScheduleFrequencyType = 'Daily'
    BackupScheduleFrequencyInterval = 1
    CompressBackup = $true
    CopyScheduleFrequencyType = 'Daily'
    CopyScheduleFrequencyInterval = 1
    GenerateFullBackup = $true
    Force = $true
}

# pass the splat
Invoke-DbaDbLogShipping @params

# And now, failover to secondary
Invoke-DbaDbLogShipRecovery -SqlInstance localhost\sql2017 -Database shipped

Other migration options such as Classic Mirroring and Availability Groups are still on the agenda.

Happy migrating!

- Chrissy

a few other community tools

Last night’s #PSPowerHour made me realize I should highlight a few awesome projects I’ve come across recently.

PSDatabaseClone

PSDatabaseClone was created by Sander Stad.

PSDatabaseClone is a PowerShell module for creating SQL Server database images and clones. It enables administrators to supply environments with database copies that are a fraction of the original size.

It is well-documented and open-source.

dbops

dbops was created by Kirill Kravtsov.

dbops is a Powershell module that provides Continuous Integration/Continuous Deployment capabilities for SQL database deployments.

It is based on DbUp, an open source .NET library that helps you deploy changes to SQL Server databases. dbops currently supports both SQL Server and Oracle.

sqlwatch

sqlwatch was created by Marcin Gminski.

The aim of this project is to provide free, repository-backed SQL Server monitoring.

The project is open-source and the developers are available on Twitter and in #sqlwatch in the SQL Server Community Slack.

PowerUpSQL

PowerUpSQL was created by Scott Sutherland.

PowerUpSQL includes functions that support SQL Server discovery, weak configuration auditing, privilege escalation on scale, and post exploitation actions such as OS command execution.

The project is open-source and was recently featured at Black Hat USA.

dbachecks

If you’re new to dbatools and not familiar with our other projects, dbachecks was created by the dbatools team and is now primarily maintained by Rob Sewell.

dbachecks is a framework created by and for SQL Server pros who need to validate their environments using crowd-sourced checklists.

The project is open-source and totally beautiful.

If you’re wondering what happened to dbareports, Rob handed it off to Jason Squires who is currently in the middle of a rewrite.

your module here

If I’ve missed your module or project, let me know in the comments and I’ll happily add it to this post!

missed pspowerhour?

Last night was the second live stream of #PSPowerHour! Check it out.

Cloning SQL Server Databases using PowerShell – Sander Stad
Getters and Setters for Classes with Custom Attributes – Ryan Bartram
Using PwSH to gather information from silos – Teresa Clark
PowerShell and RegExp to convert code – ClΓ‘udio Silva
Getting Started with Visual Studio Code – Shawn Melton
Deploying SQL code using Powershell – Kirill Kravtsov
PSKoans – Joel Sallow

- Chrissy

pspowerhour youtube livestream

Last night was the premiere of #PSPowerHour! It featured great speakers and a lot of dbatools content.

What is PSPowerHour?

Created by Michael T Lombardi and Warren F, PSPowerHour is “like a virtual User Group, with a lightning-demo format, and room for non-PowerShell-specific content. Eight community members will give a demo each PowerHour.”

Sessions are proposed and organized on GitHub, which is really cool. Both new and seasoned speakers are invited to propose topics πŸ€—

Agenda

Default Parameter Values – Chrissy LeMaire
The dime tour: PowerShell + Excel = Better Together – Doug Finke
Setting Trace Flags as Startup Parameters with dbatools – Andrew Wickham
Applying SQL Server Data Compression with dbatools – Jess Pomfret
Better, Safer SQL Queries from PowerShell – Andy Levy
Easy Desktop Notifications with BurntToast – Josh King
Raspberry Pi with PowerShell and IoT module – Daniel Silva

Everyone did such a great job, I love this livestream! As you can see above, the hour was opened up with my session about Default Parameter Values, which I wrote about earlier. Then Doug Finke talked about his really amazing module, ImportExcel.

Next, dbatools contributor Andrew Wickham talked about setting SQL Server trace flags using PowerShell. This session also features containers! After Andrew’s session, dbatools contributor Jess Pomfret talked about her awesome compression commands, which make it really easy to enable compression within SQL Server.

Finally for dbatools content, major contributor Andy Levy discusses safely (and properly!) querying SQL Server using Invoke-DbaSqlQuery.

After that, Josh King talked about his very impressive and beautiful module, BurntToast. And the hour is wrapped up by Daniel Silva, who has a fantastic demo where he controls a Raspberry Pi with PowerShell ❀

Next meeting

The next PSPowerHour will happen on August 30. The meeting’s agenda is already posted and features sessions about ChatOps, VS Code, SQL Server, getters & setters, PSKoans, PowerShell 6.0 and converting code using PowerShell.

Be sure to check it out!

- Chrissy

PowerShell Precon at PASS Summit 2018

Congrats to our teammate Rob Sewell! Rob was invited by PASS to present about PowerShell at PASS Summit 2018 on Tuesday, November 6 2018.

In his day-long session, Rob will talk about a variety of super interesting subjects including: dbachecks, PowerShell module-making, GitHub, VSTS, and dbatools. Rob is a vibrant, knowledgeable speaker and I can’t recommend this precon enough! I learn a ton every time that Rob and I present together.

Professional and Proficient PowerShell

Professional and Proficient PowerShell: From Writing Scripts to Developing Solutions

DBAs are seeing the benefit of using PowerShell to automate away the mundane. A modern data professional needs to be able to interact with multiple technologies, and learning PowerShell increases your ability to do that and your usefulness to your company.

At the end of this fun-filled day with Rob, a former SQL Server DBA turned professional automator, you will be much more confident in being able to approach any task with PowerShell, and you will leave with all of the code and demos. You can even follow along if you bring a laptop with an instance of SQL Server installed.

  • How to learn how to interact with any technology using PowerShell
  • Understanding the syntax
  • The importance of Get-Help and how PowerShell enables you to help yourself
  • Why to write your own Modules and how to make them available to all of your team
  • A quick automated method to creating your module framework
  • Unit testing and debugging your code
  • How to continuously deliver changes to your PowerShell modules using GitHub and VSTS
  • Tips and tricks for script writing with the popular open-source community dbatools module
  • How to validate your SQL Server estate with PowerShell
  • Advanced SQL Server Agent and PowerShell management

We will have a lot of fun along the way and you will return to work with a lot of ideas, samples and better habits to become a PowerShell ninja and save yourself and your organisation time and effort.

Register now

To register, visit the shortlink sqlps.io/precon

- Chrissy

decreasing module import times

SQL Operations Studio by Microsoft is like SSMS for ops, all open source and published on GitHub! They recently updated their wiki’s Performance page, addressing why SQL Operations Studio starts up slowly. Their startup stats are pretty cool!

This screenshot reminded me that I should write about our own import time stats.

confession

You may remember years ago when I expressed how upset I was about SQLPS, SqlServer’s predecessor, taking so long to import. Five whole seconds! Thanks to the community response, Microsoft took note and now it’s down to an amazing 00:00:00.9531216!

That’s about 1 second, though oftentimes I’ve seen it load in 500ms. Congrats to Microsoft! I’m jealous πŸ˜‰ At the PASS Summit PowerShell Panel last year, people asked what we loved most and hated most about PowerShell. I already knew my answer for what I hated most: long import times.

And I immediately copped to my embarrassment that I complained about SQLPS taking so long to load, yet here we were, back in November 2017, taking just as long to load. Now to be fair, we support more SQL Server functionality and we also use non-compiled code (functions/ps1 vs cmdlets/C#) which makes it easier for the community to contribute. This means we can only do so much.

redemption

I asked C# wizard, Fred, how we can improve import times, and he immediately jumped on it by adding a class that breaks down how long each section takes.

You can test this yourself by importing dbatools then running:

[Sqlcollaborative.Dbatools.dbaSystem.DebugHost]::ImportTime

What you’ll notice is that on a supafast machine, dbatools is now down to about a 1.78 second load! Incredible. This is how we did it:

runspaces

We noticed that the longest part of importing the module was importing all the extra SMO DLLs that we require for many of the commands. We import about 150 DLLs, and it looks like that number will only grow as we begin to support more functionality (such as Integration Services).

To address this concern, Fred added multi-threading via runspaces to our import process. Too cool! This resulted in a significant decrease in time.
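To give a feel for the technique, here's a simplified sketch of the idea (not our actual import code – the paths and names are illustrative):

# spin up a second runspace that loads DLLs while the import continues
$ps = [PowerShell]::Create()
$null = $ps.AddScript({
    param($dllPaths)
    foreach ($dll in $dllPaths) { Add-Type -Path $dll }
}).AddArgument((Get-ChildItem "$PSScriptRoot\bin" -Filter *.dll).FullName)
$handle = $ps.BeginInvoke()

# ... keep importing functions here ...

# block only when the assemblies are actually needed
$ps.EndInvoke($handle)
$ps.Dispose()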

allcommands.ps1

The other thing we did to significantly decrease import times was to combine all of the individual .ps1 files in the functions directory into a single .ps1 file. So now, before every release, I combine all the newly updated commands, sign the file using our code signing certificate, then publish it to the PowerShell Gallery.

It also means that we had to modify our import process to accommodate our developers because nobody, including me, wants to work on a 90,000 line file. We handled this by detecting if a .git folder exists in the dbatools path, and if it does, then it’ll skip allcommands.ps1 and import the individual .ps1 files in the functions directory.

The .git folder only exists when the git repository is cloned. This means it won’t exist in our module in the PowerShell Gallery or in a zip downloaded from GitHub.
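Conceptually, the loader's branch looks something like this (a simplified sketch, not the verbatim source):

if (Test-Path (Join-Path $PSScriptRoot '.git')) {
    # dev clone: dot-source each function so edits show up immediately
    foreach ($file in Get-ChildItem (Join-Path $PSScriptRoot 'functions') -Filter *.ps1) {
        . $file.FullName
    }
}
else {
    # Gallery or zip install: the single pre-built file imports much faster
    . (Join-Path $PSScriptRoot 'allcommands.ps1')
}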

Edit: I took this a step further and compressed the ps1 to a zip. Turns out it works super well! Check out the post for more information.

the future

I’ve heard that PowerShell Core (PSv6) is insanely fast. Unfortunately, SMO is not entirely ported to .NET Core, so we can’t yet support 6.0. However! SMO is the only portion of our module that is not 6.0 ready, so once SMO is ported, dbatools will be too πŸ‘. Hopefully this will result in faster load times.

your miles may vary

On my Windows 7 test box, dbatools loads in 2.3 seconds. On a more locked down Windows 2012 server, the import takes about 4-6 seconds.

Note that you should never experience import times over 20 seconds. If you do, check your Execution Policy, which could be impacting load times. dbatools is fancy and signed by a code-signing certificate; this is awesome for code integrity, but it’s also known to slow down imports when mixed with certain Execution Policies.

- Chrissy 🍟

three ways to track logins using dbatools

Years ago, I wrote Watch-DbaDbLogin, which keeps an inventory of accounts, hosts and programs that log into a SQL Server. It was pretty crude, but helped immensely during my migration, as this inventory ensured that my documentation was in order and no unexpected downtime would occur.

I found that about 80-90% of logins/applications were covered within 48-hours, but two months of data gave me total confidence.

I always wanted to update the command, though I’m not sure Watch-DbaDbLogin is still within the scope of the module. We’ll likely remove it in dbatools 1.0, so please accept this far cooler post in its place.

There are several ways to capture logins, all with their own pros and cons. In this post, we’ll outline four possibilities: the default trace, audits, extended events and session enumeration.

Note: The code in this post requires dbatools version 0.9.323. I found two bugs while testing sample scenarios 😬 Also, this post addresses tracking logins for migration purposes, not for security purposes. Edit the where clauses as suitable for your environment.

Using a default trace

Using the default trace is pretty lightweight and backwards compatible. While I generally try to avoid traces, I like this method because it doesn’t require remote access, it works on older SQL instances, it’s accurate and reading from the trace isn’t as CPU-intensive as it would be with an Extended Event.

Setup the SQL table

Basically, no matter which way you track your logins, you’ll need to store them somewhere. Below is some T-SQL which sets up a table that is ideal for bulk importing (which we’ll do using Write-DbaDataTable).

The table is created with an index that ignores duplicate sessions. When IGNORE_DUP_KEY is ON, a duplicate row is simply ignored. So we’re going to set up a clustered index using SqlInstance, LoginName, HostName, DatabaseName, ApplicationName and StartTime. Then the collector will send a bunch of rows via bulkcopy to the table, and the table will ignore the dupes.

To clarify, “duplicate” logins may show up, but not duplicate sessions. Watch-DbaDbLogin only recorded the first time it ever saw a login/db/host/app combination, which many people found to be less useful, especially if you run the login tracker for years. What if a login became stale?

If you’d like the first login only, remove StartTime ASC from the index.
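Here's a sketch of such a table based on the description above – treat the table name, column types and sizes as my assumptions:

# hypothetical table + index; varchar sizes keep the clustered key under SQL Server's 900-byte limit
$sql = @"
CREATE TABLE dbo.LoginTracker (
    SqlInstance varchar(128),
    LoginName varchar(128),
    HostName varchar(128),
    DatabaseName varchar(128),
    ApplicationName varchar(256),
    StartTime datetime
);
CREATE UNIQUE CLUSTERED INDEX CIX_LoginTracker
ON dbo.LoginTracker (SqlInstance, LoginName, HostName, DatabaseName, ApplicationName, StartTime ASC)
WITH (IGNORE_DUP_KEY = ON);
"@
Invoke-DbaSqlQuery -SqlInstance inventoryserver -Database dbainfo -Query $sql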

Setup the default trace

Setup the collector

Next, you’ll want to setup a collector as a scheduled SQL Agent Job.

How often should you run the job? It depends. I have one server that has login information going back to November. But I’ve found that SharePoint or System Center dedicated instances only have about 20 minutes worth of login data in the default trace.

How long does the collection take? Polling 15 servers took 14 seconds to read 55,000 records and 18 seconds to write that data. Of the 55,000 records, only 115 were unique!
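Roughly, the collector step could look like this – the server and table names are placeholders, and you should verify the columns against Read-DbaTraceFile's actual output:

# read each instance's default trace and bulk-write login rows to the tracking table
Get-DbaTrace -SqlInstance sql2016, sql2017, sql2014 -Default |
    Read-DbaTraceFile |
    Where-Object LoginName |
    Select-Object SqlInstance, LoginName, HostName, DatabaseName, ApplicationName, StartTime |
    Write-DbaDataTable -SqlInstance inventoryserver -Database dbainfo -Table LoginTracker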

Using a SQL Server Audit

Audits are cool because they can “force the instance of SQL Server to shut down, if SQL Server fails to write data to the audit target for any reason”. This ensures that 100% of your logins are captured. But my requirements for collecting migration information aren’t that high, and I haven’t found the magical Audit Spec that only logs what I need. Here’s what the .sqlaudit file for SUCCESSFUL_LOGIN_GROUP looks like when you rename it to .xel and open it.

Eh, I’m missing so much stuff. And since Audits are Extended Events anyway, and I have more control over what I do and don’t want to see, we’ll skip right to Extended Events.

Using Extended Events

You can also use Extended Events. This option is pretty cool but collecting the data does require UNC access for remote servers.

Setup the SQL table

Login Tracker template

We’ve provided a “Login Tracker” Extended Event session template that you can easily add to your estate.

This template creates a session that:

  • Is initiated by sql_statement_starting event
  • Collects the minimum possible columns
  • Ignores connections from dbatools and SSMS
  • Ignores queries to tempdb
  • Ignores system queries
  • Keeps 50 MB of data on disk (10×5)

I chose sql_statement_starting because it’s the only one that I found that actually included the database name. If this doesn’t work for you, you can modify then export/import the modified Session. If you have a better suggestion, I’d love that. Please let me know; I kinda feel like this one is overkill.

Setup the XESession
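Assuming the template ships in the dbatools template directory as described, importing and starting it is a one-liner:

# import the Login Tracker session on each server, then start it
Import-DbaXESessionTemplate -SqlInstance sql2016, sql2017 -Template 'Login Tracker' | Start-DbaXESession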

Setup the collector
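A rough sketch of the collection step – instance and table names are placeholders, and the mapped field names are my assumptions based on typical XE actions (note the UNC requirement explained below):

# read the Login Tracker files from each instance and store any new rows
Get-DbaXESession -SqlInstance sql2016, sql2017 -Session 'Login Tracker' |
    Read-DbaXEFile |
    Select-Object @{ N = 'SqlInstance'; E = { $_.server_instance_name } },
        @{ N = 'LoginName'; E = { $_.server_principal_name } },
        @{ N = 'HostName'; E = { $_.client_hostname } },
        @{ N = 'DatabaseName'; E = { $_.database_name } },
        @{ N = 'ApplicationName'; E = { $_.client_app_name } },
        @{ N = 'StartTime'; E = { $_.timestamp } } |
    Write-DbaDataTable -SqlInstance inventoryserver -Database dbainfo -Table LoginTracker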

UNC access

So instead of placing the burden of XML shredding on the CPU of the destination SQL instance, Read-DbaXEFile uses the local resources. It does this by using the RemoteTargetFile property, which is available in Get-DbaXESession output but is not a default field. To unhide non-default fields, pipe to Select-Object *.

Keep in mind that the entire file is read each time you enumerate, which is not a big deal but should be considered if you have millions of logins.

Note that I did set a max on the Login Tracker file size to 50 MB so if you want to modify that, you can use PowerShell or SSMS (Instance ➑ Management ➑ Extended Events ➑ Sessions ➑ Login Tracker ➑ right-click Properties ➑ Data Storage ➑ Remove/Add). There is no dbatools command available to do this in PowerShell yet, so you’ll have to do it manually until it’s added.

Using session enumeration

This one requires no setup at all, but only captures whoever is logged in at the time that you run the command. This approach is what I originally used in Watch-DbaDbLogin (scheduled to run every 5 minutes) and it worked quite well.

So if you’ve never seen the output for Get-DbaProcess, which does session enumeration, it’s pretty useful. If you’d like something even more lightweight that still gives you most of the information you need, you can use $server.EnumProcesses()

Actually, scratch all that. Let’s go with some lightweight, backwards-compatible T-SQL that gets us only what we need and nothing more. Honestly, of all the ways, I’ve personally defaulted back to this one. It’s just so succinct and efficient. There is the possibility that I’ll miss a login, but this isn’t a security audit and really, I inventoried 100% of the logins I needed for my last migration.

Setup the SQL table

Setup the collector
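A sketch of the collection step – the exact query from the original is my reconstruction, using sysprocesses to stay backwards-compatible with older versions:

# enumerate current sessions with plain T-SQL and bulk-write them to the tracking table
$sql = @"
SELECT @@SERVERNAME AS SqlInstance, loginame AS LoginName, hostname AS HostName,
       DB_NAME(dbid) AS DatabaseName, [program_name] AS ApplicationName, login_time AS StartTime
FROM master.dbo.sysprocesses
WHERE spid > 50
"@
Invoke-DbaSqlQuery -SqlInstance sql2016, sql2017 -Query $sql |
    Write-DbaDataTable -SqlInstance inventoryserver -Database dbainfo -Table LoginTracker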

Testing your results

If you’re testing the scripts on a non-busy system like I did, you may not get any results back because we’re ignoring connections from dbatools and SQL Server Management Studio.

If you’d like to ensure some results, just run this before performing a collection. This connects to SQL Server using a fake client name and performs a query that gathers database names.
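Something like this does the trick – a quick sketch assuming Connect-DbaInstance's -ClientName parameter:

# connect with a fake client name (so the collector's filters don't skip it) and run a query
$server = Connect-DbaInstance -SqlInstance sql2016 -ClientName 'definitely not dbatools'
$server.Query('SELECT name FROM sys.databases')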

Scheduling

To schedule the collection, you can use my favorite method, SQL Server Agent. I wrote about this in-depth in a post, Scheduling PowerShell Tasks with SQL Agent.

During my own migration, I used session enumeration and setup the collector to run every 5 minutes. With Traces or Extended Events, you can collect the logins far less frequently since they are stored on the remote server.

Hope this was helpful!
Chrissy
