Execute SQL Statements Against Multiple SQL Servers Simultaneously Using PowerShell (On-Premises)

My customer keeps me busy writing automation scripts for various requirements. These days I am helping them build a centralized repository for their on-premises SQL Servers. To start with, I want to collect the instance names and versions of SQL Server installed in their environment (quite a few servers have multiple SQL instances installed). So I want to be ready with a script that can query all SQL instances on a Windows server, and of course it will be very helpful for them if I can make the script work with any T-SQL query.

Thinking about this, I had a few options: querying WMI, using the Microsoft MAP toolkit, querying the registry, and so on. Today I am going to explore the registry, as I feel it is more reliable than WMI in many ways. (I do not want to use the MAP toolkit, because I want to avoid bringing another application into the customer's infrastructure and running behind security clearance, IT approvals, etc.)
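
For comparison, a rough sketch of the WMI/CIM route mentioned above could look like the snippet below: it simply lists the SQL Server engine services on a remote host (the server name is a placeholder and WinRM/CIM access is assumed). The rest of this post sticks with the registry approach.

# Sketch of the WMI/CIM option: list SQL Server engine services on a remote host.
# "MyRemoteServer" is a placeholder; you need permission to query CIM on the target machine.
Get-CimInstance -ClassName Win32_Service -ComputerName "MyRemoteServer" |
    Where-Object { $_.Name -eq 'MSSQLSERVER' -or $_.Name -like 'MSSQL$*' } |
    Select-Object PSComputerName, Name, State, StartName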

# Building a PowerShell object for easy presentation and display output in a tabular format in csv
$temp = New-Object -TypeName psobject
$temp | Add-Member -MemberType NoteProperty -Name HostName -Value $null
$temp | Add-Member -MemberType NoteProperty -Name InstanceName -Value $null
$temp | Add-Member -MemberType NoteProperty -Name SQLVersion -Value $null

# list of HostNames.
$hostnames = get-content "D:\Work\PowerTest\Servers_list.txt"  

Step 1. Loop through the host names.

foreach ($hostname in $hostnames)
{
$tempobj = $temp | Select-Object *

Step 2. Query the registry to find out all the SQL instances configured.

#traversing registry of the remote host to identify the sql instances installed
#ensure that you have the rights to access the registry on the remote machines

$SQLInstances = Invoke-Command -ComputerName $hostname {
(Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server').InstalledInstances }
foreach ($sql in $SQLInstances)
{

Step 3. Build a connection to each instance and execute the T-SQL.

#the registry returns the default instance as "MSSQLSERVER", which needs to be replaced with the host name for the connection
if($sql -eq 'MSSQLSERVER')
{$instance = $sql.PSComputerName
$tempobj.InstanceName = $instance}

#concatenating the host name with the instance name to get the correct connection object
else { $instance = $sql.PSComputerName,$sql -join "\"
$tempobj.InstanceName = $instance}

Step 4. Fetch the output of the queries into a CSV file in table format.

$result = Invoke-Sqlcmd -ServerInstance $instance -Database master -Query "select @@version as 'Version'"
$tempobj.SQLVersion = $result.Version
$tempobj.HostName = $hostname
$tempobj | Export-Csv "D:\Work\PowerTest\instance_details.csv" -NoTypeInformation -Append
}}
Final Script:

# Building a PowerShell object for easy presentation and display output in a tabular format in csv
$temp = New-Object -TypeName psobject
$temp | Add-Member -MemberType NoteProperty -Name HostName -Value $null
$temp | Add-Member -MemberType NoteProperty -Name InstanceName -Value $null
$temp | Add-Member -MemberType NoteProperty -Name SQLVersion -Value $null
$hostnames = get-content "D:\Work\PowerTest\Servers_list.txt"

# looping through the host names provided 
foreach ($hostname in $hostnames)
{
$tempobj = $temp | Select-Object *

#traversing registry of the remote host to identify the sql instances installed
$SQLInstances = Invoke-Command -ComputerName $hostname {
(Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server').InstalledInstances }
foreach ($sql in $SQLInstances)
{
#the registry returns the default instance as "MSSQLSERVER", which needs to be replaced with the host name for the connection
if($sql -eq 'MSSQLSERVER')
{$instance = $sql.PSComputerName
$tempobj.InstanceName = $instance}

#concatenating the host name with the instance name to get the correct connection object
else { $instance = $sql.PSComputerName,$sql -join "\"
$tempobj.InstanceName = $instance}

$result = Invoke-Sqlcmd -ServerInstance $instance -Database master -Query "select @@version as 'Version'"
$tempobj.SQLVersion = $result.Version
$tempobj.HostName = $hostname
$tempobj | Export-Csv "D:\Work\PowerTest\instance_details.csv" -NoTypeInformation -Append
}}

Notes

  • I have not used any error-handling mechanism, as that is purely the consumer's call (perhaps you can add a couple of try/catch blocks; a minimal sketch follows after these notes).
  • You can pull a lot of information from SQL Server instances; it is just that I only need versions, so you will have to shape and present the output accordingly.
  • Ensure that you have sufficient permissions to read the registry on the remote machines.
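
As mentioned in the first note, a minimal error-handling sketch could wrap the two remote calls in try/catch so that a single unreachable host or failed query does not stop the whole run. This drops into the foreach loops of the final script above and reuses its variables:

# Minimal sketch: guard the remote registry read (skip the host on failure)...
try {
    $SQLInstances = Invoke-Command -ComputerName $hostname -ErrorAction Stop {
        (Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server').InstalledInstances }
}
catch {
    Write-Warning "Could not read the registry on ${hostname}: $($_.Exception.Message)"
    continue
}

# ...and guard the version query (log the failure and move on to the next instance)
try {
    $result = Invoke-Sqlcmd -ServerInstance $instance -Database master -Query "select @@version as 'Version'" -ErrorAction Stop
}
catch {
    Write-Warning "Query failed on ${instance}: $($_.Exception.Message)"
}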

Execute SQL Scripts Against ALL Databases / in ALL SQL Servers / in ALL Resource Groups / in a Subscription (SQL Authentication)

This time I was developing an automation framework for one of my customers. I was told to use an Azure SPN for database authentication, which brings a few interesting security benefits, such as avoiding interactive logins. To accomplish this we need a user in the database (every database where you need to authenticate) for the SPN to authenticate via Azure Active Directory.
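
For context, that per-database user is created with plain T-SQL, and because CREATE USER ... FROM EXTERNAL PROVIDER has to run over an Azure AD connection, something like the Get-QueryExecutionOutput function shown later in this post (or SSMS connected as the server's Azure AD admin) can push it out. A rough sketch, where the SPN display name, server, database, and credentials are all placeholders:

# Sketch only: create a contained user for the SPN in one database and grant it a role.
# "my-spn-app" is the SPN's display name; <server>, <database>, <aad_admin> and <password> are placeholders.
$createUserQuery = @"
CREATE USER [my-spn-app] FROM EXTERNAL PROVIDER;
ALTER ROLE db_datareader ADD MEMBER [my-spn-app];
"@
$connStr = "Data Source=tcp:<server>.database.windows.net;Initial Catalog=<database>;Authentication=Active Directory Password;User id=<aad_admin>;Password=<password>;Encrypt=True;Connection Timeout=30;"
Get-QueryExecutionOutput $connStr $createUserQuery   # function defined in the last section of this post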

But the challenge here is that the user needs to be created on 1000+ databases spread across multiple resource groups and SQL servers. One way to do this is with an Azure runbook, but in most enterprise environments the database management team would not have access to create or manage runbooks, which leaves them heavily dependent on another team.

To overcome this dependency I came up with a PowerShell script that can be used wherever we need to bulk push one or more T-SQL statements to multiple databases at a time. I have used "Get-AzureRmResourceGroup", which gives me the list of resource groups in a specific subscription, and "Get-AzureRmSqlServer", which gives me the available SQL servers in a specific resource group. So I have piped "Get-AzureRmResourceGroup" into "Get-AzureRmSqlServer" to get the list of SQL servers provisioned in a subscription.

Final script:

#Provide your tenant_id and subscription_id
Connect-AzureRmAccount -TenantId "<tenant_id>" -SubscriptionId "<subscription_id>"

#Identifying all sql servers in connected subscription 
$servernamesall = Get-AzureRmResourceGroup | Get-AzureRmSqlServer

#Collecting unique resource group names that host SQL servers (I'm not bothered about resource groups that don't have SQL servers)
$resourcegroups = $servernamesall.ResourceGroupName | Select-Object -Unique

#Looping through each resource group collected from above #parent_loop
foreach($resourcegroup in $resourcegroups)
{
Write-Output "------------------------------"
Write-Output "Fetching from resource group : $($resourcegroup)"
$serverlistraw = Get-AzureRmSqlServer -ResourceGroupName $resourcegroup
$serverlists = $serverlistraw.ServerName

#Looping through each sql server in the resource group #child_loop1
foreach ($serverlist in $serverlists)
{
$serverlist=$serverlist+".database.windows.net"
Write-Output "Fetching from server:$($serverlist)"

#Excluding data warehouse databases
$databases = Invoke-Sqlcmd -ServerInstance $serverlist -Username "<username>" -Password "<password>" -Database master -Query "select sd.name from sys.databases sd join [sys].[database_service_objectives] sdo
on sd.database_id = sdo.database_id where sd.database_id <> 1 and sdo.edition not like 'DataWarehouse'"

#Looping through each database collected from sys.databases table #child_loop2
foreach ($database in $databases) 
{
#T-SQL which needs to be executed in each database
#I'm constructing another connection here, as cross-database queries are not possible in Azure SQL Database (PaaS)
$arrayforoutput = @()
Write-Output "Executing against database:$($database.name)"
$output = Invoke-Sqlcmd -ServerInstance $serverlist -Username "<username>" -Password "<password>" -Database $database.name -Query "declare @servernm nvarchar(20),
@dbname nvarchar(20)
select @servernm = @@servername
select @dbname = db_name()
select 'Script executed in server '+@servernm +' against a database '+@dbname"
Write-Output $($output.Column1)
$arrayforoutput += $output
}
#If you want to write your output to a text file
$arrayforoutput | Format-Table -HideTableHeaders | Out-File -FilePath C:\powershell_output\query_output.txt -Append
}
}

Notes:

  • Here I have used SQL authentication to run the script; if you want to use an Azure AD account, you may have to use the "Get-QueryExecutionOutput" function from my previous blog.
  • Error handling is very minimal in this script; you may need better handling, perhaps a try/catch, in case you need to debug a failed query or loop.
  • I have used a single subscription here for ease of testing (I have only one). If you have multiple subscriptions, you can add one more loop to go through a list of subscriptions from a source text / Excel file (see the sketch after these notes).
  • Instead of querying "sys.databases", we can use "Get-AzSqlDatabase" to drive the loop as well; however, it would be a bit clumsy if you want to apply any filters.
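
As mentioned in the third note, a minimal sketch of an outer subscription loop could look like the following; the subscription list file path is a placeholder, and the body is simply the existing resource-group / server / database loops from the script above:

# Sketch only: outer loop over subscription IDs read from a text file ("D:\Work\subscriptions.txt" is a placeholder path).
$subscriptionids = Get-Content "D:\Work\subscriptions.txt"
foreach ($subscriptionid in $subscriptionids)
{
# switch the context to the current subscription before enumerating its resource groups
Select-AzureRmSubscription -SubscriptionId $subscriptionid
$servernamesall = Get-AzureRmResourceGroup | Get-AzureRmSqlServer
# ...followed by the resource group / server / database loops shown above
}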

Execute a SQL Script Against a Set of Azure PaaS Databases Listed in an Excel File (Azure AD Authentication)

One of my customers has a variety of applications hosted in a hybrid environment (on-premises and cloud), predominantly on the Azure platform, and a significant effort is being put into moving the back-end platform to Azure wherever it is viable. This pushed me to develop quite a few automation scripts which helped us get things done with minimal manual work.

Recently I was pulled in to develop a script to convert a PaaS database tier from DTU to vCore, for whatever business reasons they found. Let me tell you, I am talking about 5000+ targeted databases spread from prod to non-prod, so it is impossible to accomplish this task using the Azure portal / UI. Hence I had to use a script and scale the execution out across multiple databases.

We had brainstorming hours to gather the requirements and make our road map vivid; of course we decided to go for PowerShell without much hesitation!

  1. It is easily doable using the DDL "alter database <database_name> modify (edition = 'GeneralPurpose', service_objective = '<target_tier>')". Let's see how I can execute this in a loop.
  2. We need a brief downtime during this activity (while a database gets migrated from DTU to vCore), so the duration of the execution may depend on the region, off-business hours, etc. (how busy your database is when you migrate).
  3. One approach I considered was to get the database names using "sys.databases" from the master database and pass them into a loop that executes the migration script. But this is not possible due to point 2, as they have a mixed collection of databases hosted on a single logical server.
  4. The customer can provide me an Excel file with server names, database names, and the corresponding vCore tier we are targeting (following a rule of thumb: every 100 DTUs in the standard tier require at least 1 vCore, and every 125 DTUs in the premium tier require at least 1 vCore; see the small sizing sketch after this list). We picked this approach as it gives us more control over the activity.
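
Purely as an illustration of that rule of thumb, a tiny helper like the one below (a hypothetical function, not part of the final script) turns a DTU count into the minimum vCore count:

# Hypothetical helper illustrating the sizing rule of thumb above (not used by the final script).
function Get-MinimumvCores ($Dtu, $Tier) {
# standard tier: roughly 100 DTUs per vCore; premium tier: roughly 125 DTUs per vCore
$dtusPerVcore = if ($Tier -eq 'Premium') { 125 } else { 100 }
return [math]::Ceiling($Dtu / $dtusPerVcore)
}

# Examples: an S3 (100 DTU) standard database needs at least 1 vCore,
# and a P2 (250 DTU) premium database needs at least 2 vCores.
Get-MinimumvCores -Dtu 100 -Tier 'Standard'   # 1
Get-MinimumvCores -Dtu 250 -Tier 'Premium'    # 2
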
Final script:

# Defined this function to run the query over an ADO.NET (System.Data.SqlClient) connection,
# the reason being that Microsoft has already disclosed there is no plan
# to drop support for System.Data.SqlClient any time soon.
# I want to use the more secure AAD account for authentication.
# Note: Invoke-Sqlcmd doesn't support Azure Active Directory authentication as of 3/16/2020.

function Get-QueryExecutionOutput ($SQLconnectionstring, $SQLquery) {
$Conn = New-Object System.Data.SqlClient.SqlConnection($SQLconnectionstring)
$Conn.open()
$read = New-Object System.Data.SqlClient.SqlCommand($SQLquery,$Conn)
$read.CommandTimeout = "300"
$da = new-object System.Data.SqlClient.SqlDataAdapter($read)
$dt = New-Object system.Data.datatable
[void]$da.fill($dt)
$Conn.close()
return $dt
}

# My source Excel file path; change it appropriately to yours
$excelfile = "C:\Array\vCore.xlsx"
$sheetgiven = "Sheet1"


# Create the Excel COM object (this requires Excel to be installed on the machine running the script)
$excelCOM = New-Object -com Excel.Application
#$excelCOM.Visible = $false

#opening the source file and reading the data from the sheet name assigned to the variable above
$workbook = $excelCOM.workbooks.open($excelfile)
$sheet = $workbook.Worksheets.Item($sheetgiven)

#select total rows
$maxrows = ($sheet.UsedRange.Rows).Count

#assign a new PowerShell object with properties ServerName, DatabaseName and Targetmodel
$powershellobject = New-Object -TypeName psobject
$powershellobject | Add-Member -MemberType NoteProperty -Name ServerName -Value $null
$powershellobject | Add-Member -MemberType NoteProperty -Name DatabaseName -Value $null
$powershellobject | Add-Member -MemberType NoteProperty -Name Targetmodel -Value $null
$outputarray = @()
#$i = 2, because row 1 contains the column headers (which I am skipping)
#the for loop will iterate until we reach the max row count
for ($i = 2; $i -le $maxrows; $i++)
{
    $adhocobject = $powershellobject | Select-Object *
    $adhocobject.ServerName = $sheet.Cells.Item($i,1).Text
    $adhocobject.DatabaseName = $sheet.Cells.Item($i,2).Text
    $adhocobject.Targetmodel = $sheet.Cells.Item($i,3).Text
    $servername = $adhocobject.ServerName
    $Databasename = $adhocobject.DatabaseName
    $Targetmodel = $adhocobject.Targetmodel
    $SQLquery = "alter database $Databasename  modify(edition = 'GeneralPurpose' ,service_objective = '$Targetmodel')"
    $SQLconnectionstring = "Data Source=tcp:$servername;Initial Catalog=master;Authentication=Active Directory Password;User id=<>;Password=<>;Encrypt=True;TrustServerCertificate=True;Connection Timeout=30;"

# function call to execute the query against the dynamically built connection string .
Get-QueryExecutionOutput "$SQLconnectionstring" $SQLquery 
#creates a new array with the same elements as old one + the new item, and this new larger array replaces the old one in the $outputarray variable         
$outputarray += $adhocobject
}
$excelCOM.quit()
# clean up any Excel process left behind by the COM automation
Get-Process *excel* | Stop-Process -Force

Expectations from the DBA:

  • Install the required Azure PowerShell modules.
  • $excelfile = "C:\Array\vCore.xlsx" should point to your Excel file.
  • The sheet should have only 3 columns (ServerName, DatabaseName, Targetmodel), in that order. You are expected to fill in the server name, database name, and target model.
  • No null values are accepted in the Excel sheet.
  • You may have to filter out elastic pool databases from the list.
  • If you want detailed output, you may define a variable to capture the output of the "Get-QueryExecutionOutput" function; the variable content can then be written to a text file using "Out-File" with -Append (within the for loop), as in the sketch below.
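
As described in the last note, a minimal sketch of capturing the function output inside the for loop could look like this (the output file path is a placeholder):

# Sketch only: capture whatever rows the query returns and append them to a text file inside the for loop.
# "C:\powershell_output\vcore_migration_output.txt" is a placeholder path.
$migrationOutput = Get-QueryExecutionOutput "$SQLconnectionstring" $SQLquery
$migrationOutput | Format-Table -AutoSize | Out-File -FilePath "C:\powershell_output\vcore_migration_output.txt" -Append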