My last post covered the SQL Assessment extension in Azure Data Studio, and in this article I'm going to look at running the SQL Assessment API using PowerShell instead. Whilst in ADS we can export assessment results to HTML or into SQL via a script, the PowerShell option is a bit more flexible: we can export straight to a SQL table and run the assessment across multiple instances.
I'll follow the PowerShell instructions from the SQL Assessment API web page. The following code runs the assessment on my localhost instance and writes the results to a table (Assessment.Results) in the SQLAssessmentDemo database; the -Force parameter creates the objects if they don't already exist:
Get-SqlInstance -ServerInstance 'localhost' |
Invoke-SqlAssessment -FlattenOutput |
Write-SqlTableData -ServerInstance 'localhost' -DatabaseName SQLAssessmentDemo -SchemaName Assessment -TableName Results -Force
We're not limited to one instance either; in the following script I run the assessment against both my default instance and a named instance (localhost\MSSQLSERVER01):
Get-SqlInstance -ServerInstance 'localhost', 'localhost\MSSQLSERVER01' |
Invoke-SqlAssessment -FlattenOutput |
Write-SqlTableData -ServerInstance 'localhost' -DatabaseName SQLAssessmentDemo -SchemaName Assessment -TableName Results -Force
For added flexibility I can store the names of the SQL instances in a text file and loop through its contents:
$servers = Get-Content -Path C:\SQL\SQLServers.txt
ForEach ($server in $servers)
{
Get-SqlInstance -ServerInstance $server |
Invoke-SqlAssessment -FlattenOutput |
Write-SqlTableData -ServerInstance 'localhost' -DatabaseName SQLAssessmentDemo -SchemaName Assessment -TableName Results -Force
}
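If any server in the list is unreachable the pipeline will stop with an error, so it may be worth wrapping the loop body in a try/catch. This is just a sketch of one way to do it; the -ErrorAction Stop setting and the Write-Warning message are my own additions, not part of the documented example:

$servers = Get-Content -Path C:\SQL\SQLServers.txt
ForEach ($server in $servers)
{
    try
    {
        Get-SqlInstance -ServerInstance $server -ErrorAction Stop |
        Invoke-SqlAssessment -FlattenOutput |
        Write-SqlTableData -ServerInstance 'localhost' -DatabaseName SQLAssessmentDemo -SchemaName Assessment -TableName Results -Force
    }
    catch
    {
        # Log the failure and carry on with the remaining servers
        Write-Warning "Assessment failed for $server : $_"
    }
}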
The scripts export the assessment data into the specified table, where the check information is stored along with items such as a timestamp, the ruleset used and the help link. One thing that is missing is the tags that are available in ADS, which I think would be really useful when querying the data.
One column of note is TargetType. In ADS, when the assessment is run it returns checks for both the instance and each of its databases, but in PowerShell the server and database assessments need to be run as separate scripts. Here's the database-level one, which uses Get-SqlDatabase as opposed to Get-SqlInstance:
$servers = Get-Content -Path C:\SQL\SQLServers.txt
ForEach ($server in $servers)
{
Get-SqlDatabase -ServerInstance $server |
Invoke-SqlAssessment -FlattenOutput |
Write-SqlTableData -ServerInstance 'localhost' -DatabaseName SQLAssessmentDemo -SchemaName Assessment -TableName Results -Force
}
And of course we can combine the server- and database-level scripts into one, which will gather both sets of assessment results:
$servers = Get-Content -Path C:\SQL\SQLServers.txt
ForEach ($server in $servers)
{
Get-SqlInstance -ServerInstance $server |
Invoke-SqlAssessment -FlattenOutput |
Write-SqlTableData -ServerInstance 'localhost' -DatabaseName SQLAssessmentDemo -SchemaName Assessment -TableName Results -Force
Get-SqlDatabase -ServerInstance $server |
Invoke-SqlAssessment -FlattenOutput |
Write-SqlTableData -ServerInstance 'localhost' -DatabaseName SQLAssessmentDemo -SchemaName Assessment -TableName Results -Force
}
Now that the data is stored in SQL we can look at the results. For my two instances I have 1213 rows of assessment data to look through, which is a lot. To filter which instance I return assessment results for, I can use the TargetPath column (I'm having to double up on the single quotes to avoid a syntax error):
SELECT * FROM
[SQLAssessmentDemo].[Assessment].[Results]
WHERE TargetPath = 'Server[@Name=''DESKTOP-MFKSHR7'']'
If I want to filter on just the server-level assessment items I can use the following:
SELECT * FROM
[SQLAssessmentDemo].[Assessment].[Results]
WHERE TargetPath = 'Server[@Name=''DESKTOP-MFKSHR7'']'
AND TargetType = 'Server'
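With this many rows it can also help to get a feel for the data before drilling in. This summary query assumes the flattened output includes a Severity column (it does on my table, but check your own columns):

SELECT Severity, COUNT(*) AS Checks
FROM [SQLAssessmentDemo].[Assessment].[Results]
GROUP BY Severity
ORDER BY Checks DESC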
Filtering for groups of items is a little more difficult without the Tag column that is available in Azure Data Studio. As the assessment has a rule for each individual SQL Agent alert based on severity, we need to use a LIKE clause to return all the relevant results:
SELECT * FROM
[SQLAssessmentDemo].[Assessment].[Results]
WHERE TargetPath = 'Server[@Name=''DESKTOP-MFKSHR7'']'
AND TargetType = 'Server'
AND CheckId LIKE 'AgentAlerts%'
The recommendation itself is stored in the Message column; here's the field content for the AgentAlertsSeverity10 CheckId row (that's a lot of alerts):
Create alerts for errors: 825, 833, 855, 856, 3452, 3619, 17179, 17883, 17884, 17887, 17888, 17890, 28036
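Acting on that recommendation means creating a SQL Agent alert for each error number. As a sketch, here's the T-SQL for the first error in the list; the alert name is my own choice, not something the assessment prescribes:

EXEC msdb.dbo.sp_add_alert
    @name = N'Error 825 - Read-Retry Required',
    @message_id = 825,
    @severity = 0,
    @enabled = 1,
    @include_event_description_in = 1;

The same call can be repeated for each of the other error numbers in the message.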
The chances are you have more than one database on an instance of SQL Server, and the database name of each assessment rule is stored within the TargetPath column, such as Server[@Name='DESKTOP-MFKSHR7']/Database[@Name='WideWorldImporters']; again we'd need to double up the quotes when filtering on the value in SQL. The following query returns all the database assessment results for the WideWorldImporters database:
SELECT * FROM [SQLAssessmentDemo].[Assessment].[Results]
WHERE TargetType = 'Database'
AND TargetPath = 'Server[@Name=''DESKTOP-MFKSHR7'']/Database[@Name=''WideWorldImporters'']'
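To see how the checks are spread across all the databases on an instance, grouping on TargetPath works too; this is just an illustrative query against the same table:

SELECT TargetPath, COUNT(*) AS Checks
FROM [SQLAssessmentDemo].[Assessment].[Results]
WHERE TargetType = 'Database'
GROUP BY TargetPath
ORDER BY Checks DESC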
The database checks are very comprehensive, particularly the number of checks on indexes, and items such as unused or missing indexes should always be reviewed very carefully.
As an example, one of the results on my instance (for WideWorldImporters) is to revise the unused index [FK_Sales_Customers_PostalCityID]. If I went ahead and dropped the index it would then be picked up by the assessment rule that checks for foreign keys with no index (FKNoIndexes).
If we wanted to remove data from the SQL table prior to exporting then a straightforward TRUNCATE TABLE can be used before repopulating the data: TRUNCATE TABLE [SQLAssessmentDemo].[Assessment].[Results]. But as a Timestamp column is also exported, we can instead build an automated weekly process (such as scheduling the PowerShell script to run via SQL Agent) that stores a complete repository of assessment results over time, which can then be filtered on the date to track history.
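For example, once a few weekly runs have accumulated, the Timestamp column lets us pull back just the most recent assessment. A sketch, assuming the column is exported as Timestamp and casting to a date because each check within a run is stamped individually:

SELECT * FROM [SQLAssessmentDemo].[Assessment].[Results]
WHERE CAST([Timestamp] AS date) =
    (SELECT MAX(CAST([Timestamp] AS date)) FROM [SQLAssessmentDemo].[Assessment].[Results])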
This is the big advantage of the PowerShell approach: it's really straightforward to implement an automated best practice assessment of your SQL Server instances, and having the data in SQL Server means the creative types can build visualisations on top of it for funky things like dashboards. Worth noting the table isn't indexed in any fashion by default, so you'll need to look into that for better-performing queries.
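As a starting point, a nonclustered index on the columns we've been filtering on would be a sensible sketch. One caveat: if Write-SqlTableData created the string columns as nvarchar(max) they can't be index keys, so you may need to alter them to a bounded length first; adjust to your own query patterns:

CREATE NONCLUSTERED INDEX IX_Results_TargetPath
ON [Assessment].[Results] (TargetPath, TargetType)
INCLUDE (CheckId);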
It's worth adding that there are plenty of third-party options for SQL Server assessments out there, both paid-for and open source. The SQL Assessment API is another option to take a look at; the underlying JSON is customisable too if you want to tweak settings, disable rules or add your own, and the simplicity of PowerShell along with the ability to run the assessment over multiple instances is a big advantage for me.