Welcome to a quick post that should help you operate your SQL Server environment more consistently and reduce manual, repetitive work.
When you are running SQL Server Availability Groups, one of the most cumbersome tasks is ensuring that all logins are synchronized between all replicas. While not exactly rocket science, it quickly adds up to a lot of work if you are managing more than one or two Availability Groups.
Wouldn’t it be nice to have a script that is flexible enough to handle this for you automatically?
Well, dbatools to the rescue again.
With dbatools, such a routine takes only a few lines of code.
The script below connects to the Availability Group listener, queries it for the current primary replica and all secondary replicas, and then synchronizes all logins from the primary to each secondary.
In the template code, no changes are actually written due to the -WhatIf switch, so that you can safely test it to see what changes would be committed.
<#
    Script : SyncLoginsToReplica.ps1
    Author : Andreas Schubert (http://www.linkedin.com/in/schubertandreas)
    Purpose: Sync logins between all replicas in an Availability Group automatically.
    --------------------------------------------------------------------------------------------
    The script will connect to the listener name of the Availability Group and read all
    replica instances to determine the current primary replica and all secondaries.
    It will then connect directly to the current primary, query all logins and create them
    on each secondary.
    Attention: The script is provided so that no action is actually executed against the
    secondaries (switch -WhatIf). Change that line according to your logic; you might want
    to exclude other logins or decide not to drop any existing ones.
    --------------------------------------------------------------------------------------------
    Usage: Save the script in your file system, change the name of the AG listener
    (AGListenerName in this template) and schedule it to run at your preferred schedule.
    I usually sync logins once per hour, although on more volatile environments it may
    run as often as every minute.
#>

# define the AG name
$AvailabilityGroupName = 'AGListenerName'

# internal variables
$ClientName = 'AG Login Sync helper'
$primaryInstance = $null
$secondaryInstances = @()

try {
    # connect to the AG listener, get the name of the primary and all secondaries
    $replicas = Get-DbaAgReplica -SqlInstance $AvailabilityGroupName
    $primaryInstance    = $replicas | Where-Object Role -eq Primary | Select-Object -ExpandProperty Name
    $secondaryInstances = $replicas | Where-Object Role -ne Primary | Select-Object -ExpandProperty Name

    # create a connection object to the primary
    $primaryInstanceConnection = Connect-DbaInstance $primaryInstance -ClientName $ClientName

    # loop through each secondary replica and sync the logins
    $secondaryInstances | ForEach-Object {
        $secondaryInstanceConnection = Connect-DbaInstance $_ -ClientName $ClientName
        Copy-DbaLogin -Source $primaryInstanceConnection -Destination $secondaryInstanceConnection -ExcludeSystemLogins -WhatIf
    }
}
catch {
    $msg = $_.Exception.Message
    Write-Error "Error while syncing logins for Availability Group '$AvailabilityGroupName': $msg"
}
To make this reusable, you could easily turn the script into a function by adding the two variables as parameters. Then you could call it from any other script like this:
SyncLoginsToReplica.ps1 -AvailabilityGroupName YourAGListenerName -ClientName "Client"
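A minimal sketch of what that function wrapper could look like, based on the script above. The function name, parameter layout and default values are my own assumptions, not part of the original script, and the dbatools module must be installed:

```powershell
function Sync-LoginsToReplica {
    [CmdletBinding()]
    param (
        # name of the Availability Group listener to connect to
        [Parameter(Mandatory)]
        [string]$AvailabilityGroupName,

        # application name reported to SQL Server, useful for tracing sessions
        [string]$ClientName = 'AG Login Sync helper'
    )

    # determine the current primary and all secondary replicas via the listener
    $replicas    = Get-DbaAgReplica -SqlInstance $AvailabilityGroupName
    $primary     = $replicas | Where-Object Role -eq Primary | Select-Object -ExpandProperty Name
    $secondaries = $replicas | Where-Object Role -ne Primary | Select-Object -ExpandProperty Name

    $primaryConnection = Connect-DbaInstance $primary -ClientName $ClientName

    foreach ($secondary in $secondaries) {
        $secondaryConnection = Connect-DbaInstance $secondary -ClientName $ClientName
        # remove -WhatIf once you are happy with the reported changes
        Copy-DbaLogin -Source $primaryConnection -Destination $secondaryConnection -ExcludeSystemLogins -WhatIf
    }
}
```

Dot-source the file (or put the function into a module) and you can call `Sync-LoginsToReplica -AvailabilityGroupName YourAGListenerName` from any other script.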
For simplicity, I created this as a standalone script though.
I hope you find this post useful. For questions and remarks please feel free to message me!
Hey all, I am Andreas Schubert and I am working as a Principal Consultant and Database Reliability Engineer for SQL Server & Azure for multiple national and international companies. My focus is on implementing and operating complex 24/7 SQL environments with tens and hundreds of servers and multi-terabyte databases.
With the multitude of environments that I am operating, it’s impossible to remember every server, every database or the multiple different ways they are interacting with each other. Therefore, one of the first things I do when taking over a consulting engagement is mapping out all those different bits of information.
Since the environments usually change pretty fast, my goal is to automate this process as much as possible.
In this series of posts, I will try to show you how I am implementing this. Of course, your requirements or implementations may differ, but hopefully this blog post can give you some ideas about your tasks too.
Before dbatools existed, I had to rely on either the various monitoring solutions that my customers are using or on scripts created by myself. There are a lot of really great 3rd party tools out there that do an awesome job. Unfortunately, they all differ in how they are used or what information they report back. I needed something that is easy to implement, with as few dependencies as possible and works across all SQL Server versions. That’s when I started using dbatools.
I immediately fell in love with how flexible it is. And boy, did its functionality grow fast!
Today, there are tons of commands available that cover almost all of the various areas SQL Server has to offer.
Before I dive into specific SQL Servers for in-depth analysis, I want to see some sort of inventory. The minimum information I would like to collect is:

- SQL Server version, product level and edition
- memory and processor count
- instance name
- number and total size of user databases, plus the biggest database
- Availability Group membership, replica role and cluster name
Whoever has built an inventory script in the past knows that collecting the above information requires quite a few scripts. On top of that, the underlying DMVs have changed between SQL versions, so you need to account for that. Microsoft has made this much easier by providing SMO (SQL Server Management Objects), a set of libraries that abstract away the complexity of collecting that information. Thankfully, Microsoft also enabled the dbatools team to include SMO in their framework. My example solution relies solely on dbatools (which works – not only, but also – with SMO).
OK, enough talk, let’s jump straight into the code.
Right at the beginning of any of my scripts, I am defining the root of the script itself. I do this because I re-use a lot of functions.
Since we want to collect the information for more than one SQL Server instance, we will first build a list of SQL Servers. We could query the list of instances from a central management server, but for the purpose of this post – and portability – we will keep it simple. We will also assume the account executing this script has sufficient permissions on each SQL Server instance and can connect via Windows Authentication. I generally prefer Windows Authentication over SQL Server Authentication due to security concerns, but that is a completely separate topic.
The names of the Servers will be coming from a simple text file in our example. Just do me a favour and do NOT put your server list into an unsecured network location – again, we need to keep security in mind.
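For illustration, such a Production.txt simply lists one server or instance per line. The names below are, of course, made up:

```text
SQLPROD01
SQLPROD02
SQLPROD03\INST2
```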
Next, we need to load this file into our PowerShell session. For the sake of simplicity, I am loading it explicitly in my script. Normally, I have a variable populated with the servers in my profile, so I don’t have to do this each time.
$script:root = 'D:\AdminScripts'
$ProductionServers = Get-Content (Join-Path $script:root -ChildPath 'Production.txt')
Similar to the actual server list, I am using a text file “AliasList.txt” to store the alias information I mentioned above:
It’s the same system: the name of the server or instance, followed by the alias name, with both values separated by a semicolon. Loading and parsing the alias information is a simple one-liner in PowerShell:
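Assuming the semicolon-separated format just described, AliasList.txt might contain entries like these (instance and alias names invented for illustration):

```text
SQLPROD01;Webshop Production
SQLPROD02;ERP Production
SQLPROD03\INST2;Reporting
```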
$AliasList = Get-Content (Join-Path $script:root -ChildPath 'AliasList.txt') |
    Select-Object @{Name = "Instance"; Expression = { $_.ToString().Split(';')[0] } },
                  @{Name = "Alias";    Expression = { $_.ToString().Split(';')[1] } }
Since I usually exclude system databases from my reports, I am defining a separate list of them as well for easier reuse:
$systemDBs = "master", "model", "msdb", "tempdb", "ReportServer", "ReportServerTempDB"
At this point, we have all the preliminaries completed: A list of SQL Servers to query, a list of system databases that we will exclude and a list of alias information. Let’s hit the servers and put the resulting data into a variable. I’ll first show the complete code block, then we will talk about what it does.
$rawData = $ProductionServers | Connect-DbaInstance | Sort-Object ComputerName |
    Select-Object ComputerName,
    # map the SQL version
    @{Name = "SQL Version"; Expression = {
        if     ($_.VersionMajor -eq 11) { "SQL 2012" }
        elseif ($_.VersionMajor -eq 12) { "SQL 2014" }
        elseif ($_.VersionMajor -eq 13) { "SQL 2016" }
        elseif ($_.VersionMajor -eq 14) { "SQL 2017" }
        elseif ($_.VersionMajor -eq 15) { "SQL 2019" }
        elseif ($_.VersionMajor -lt 11) { "SQL 2008R2 or older" }
        else                            { "unknown" } } },
    ProductLevel,
    Edition,
    # RAM
    @{Name = "Memory (GB)"; Expression = { [math]::Round(($_.PhysicalMemory) / 1024) } },
    Processors,
    InstanceName,
    # total count of user dbs
    @{Name = "User DBs"; Expression = { ($_.Databases | Where-Object { $_.Name -notin $systemDBs } | Measure-Object).Count } },
    # total db size for all user dbs
    @{Name = "Total DB Size (GB)"; Expression = { [math]::Round(($_.Databases | Where-Object { $_.Name -notin $systemDBs } | Measure-Object -Property Size -Sum).Sum / 1024) } },
    # biggest DB (name and size in GB)
    @{Name = "Biggest DB (GB)"; Expression = {
        $biggestDB = $_.Databases | Where-Object { $_.Name -notin $systemDBs } | Sort-Object Size -Descending | Select-Object -First 1
        "$($biggestDB.Name) ($([math]::Round($biggestDB.Size / 1024)) GB)" } },
    # add the name of the Availability Group (if any)
    @{Name = "AG (s)"; Expression = { $_ | Select-Object -ExpandProperty AvailabilityGroups | Select-Object -ExpandProperty AvailabilityGroupListeners } },
    # add the current role of the server in the Availability Group (if any)
    @{Name = "Role (s)"; Expression = { $_ | Select-Object -ExpandProperty AvailabilityGroups | Select-Object -ExpandProperty LocalReplicaRole } },
    ClusterName |
    Sort-Object ComputerName
While this code may look complex, from a PowerShell point of view it’s really pretty simple. First, we take the list of our production servers and pipe it to the Connect-DbaInstance cmdlet. Connect-DbaInstance uses the SMO functionality included in dbatools, returning a complete SMO object for each SQL Server it connects to.
Technically, the part with $rawData = $ProductionServers | Connect-DbaInstance
already gives us all the information we need for our report. But since we don’t want to return all the possible SMO properties and objects (that would result in a very long operation), we pipe the results of this directly to a Sort, followed by returning the actual information we are interested in:
$rawData = $ProductionServers | Connect-DbaInstance | Sort-Object ComputerName | Select-Object ComputerName,
First we extract the Computername. On the next two lines, we map the SQL Server major version number to a clear-text string:
# map the SQL version
@{Name = "SQL Version"; Expression = {
    if     ($_.VersionMajor -eq 11) { "SQL 2012" }
    elseif ($_.VersionMajor -eq 12) { "SQL 2014" }
    elseif ($_.VersionMajor -eq 13) { "SQL 2016" }
    elseif ($_.VersionMajor -eq 14) { "SQL 2017" }
    elseif ($_.VersionMajor -eq 15) { "SQL 2019" }
    elseif ($_.VersionMajor -lt 11) { "SQL 2008R2 or older" }
    else                            { "unknown" } } },
Next, we extract the product level (e.g. RTM, SP1, …) and the edition of the SQL Server (Standard, Enterprise, …), followed by the available machine memory. Since the memory is returned in MB, we round it so that we get a nice number in GB (e.g. 12, 48 or 128).
Then we include the number of logical processors and the name of the instance – in case we have a named instance.
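For reference, this is the excerpt of the pipeline above that covers those properties; everything comes straight from the SMO object, and only the memory needs a calculated expression to convert MB to GB:

```powershell
ProductLevel,
Edition,
# RAM is reported in MB, so convert and round to full GB
@{Name = "Memory (GB)"; Expression = { [math]::Round(($_.PhysicalMemory) / 1024) } },
Processors,
InstanceName,
```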
The number and size of user databases is a bit more complex. We need to query the “Databases” collection of the SMO Server object, filter out the system databases, get the size property of each object in the collection and measure it (count for the number and SUM for the combined size of the databases). Of course, we want those numbers to be nicely formatted and rounded to the full GB, so we add the formatting as well:
# total count of user dbs
@{Name = "User DBs"; Expression = { ($_.Databases | Where-Object { $_.Name -notin $systemDBs } | Measure-Object).Count } },
# total db size for all user dbs
@{Name = "Total DB Size (GB)"; Expression = { [math]::Round(($_.Databases | Where-Object { $_.Name -notin $systemDBs } | Measure-Object -Property Size -Sum).Sum / 1024) } },
To get the size of the biggest / largest database on the server, we use the same technique, only that we sort the database object list by size in descending order and take only the first object:
# biggest DB (name and size in GB)
@{Name = "Biggest DB (GB)"; Expression = {
    $biggestDB = $_.Databases | Where-Object { $_.Name -notin $systemDBs } | Sort-Object Size -Descending | Select-Object -First 1
    "$($biggestDB.Name) ($([math]::Round($biggestDB.Size / 1024)) GB)" } },
The PowerShell pipeline can be really awesome!
Now let’s add the information about the Availability Group Listener to our list. The AG information is another sub-object of the SMO collection which we first have to extract to get to the information below.
# add the name of the Availability Group (if any)
@{Name = "AG (s)"; Expression = { $_ | Select-Object -ExpandProperty AvailabilityGroups | Select-Object -ExpandProperty AvailabilityGroupListeners } },
And exactly the same way for the role of the current replica as well as the cluster object name:
# add the current role of the server in the Availability Group (if any)
@{Name = "Role (s)"; Expression = { $_ | Select-Object -ExpandProperty AvailabilityGroups | Select-Object -ExpandProperty LocalReplicaRole } },
ClusterName |
Sort-Object ComputerName
That’s a whole lot of information retrieved with just one PowerShell pipeline. How awesome is that? Most of the script is logic around formatting and extracting information from sub-properties and objects.
And finally, we add the Alias to our result, matching them by computername:
# add the alias to the raw data, matching on computer (and instance) name
$rawData | ForEach-Object {
    $v = $_.ComputerName
    if ('' -ne $_.InstanceName) { $v += "\$($_.InstanceName)" }
    $alias = $AliasList | Where-Object { $_.Instance -eq $v } | Select-Object -ExpandProperty Alias -First 1
    $_ | Add-Member -MemberType NoteProperty -Name AliasName -Value $alias
}
The last step is to convert our object to HTML and add some CSS styling to it. Then we can either send it via email or store the generated HTML as a file for future reference.
$css = Get-Content (Join-Path $script:root -ChildPath 'css.txt')
$rawData |
    ConvertTo-Html -Fragment -PreContent "$($css)<h2>Instance KPI Summary</h2>" -PostContent "This summary has been generated with the help of the awesome PowerShell module dbatools!" |
    Out-File (Join-Path $script:root -ChildPath 'result.html')
The result is a nicely formatted html report:
This was only a very basic example of what you can do with dbatools, PowerShell and a bit of magic piping. I hope you found it useful.
For questions and remarks please feel free to message me at any time!
You can find the complete script in my GitHub repo.