Dbatools to V1.0 and Beyond

Good News, Everyone!

Dbatools has finally reached critical mass and is being released as version 1.0. This is a major milestone for the maintainers and developers of this module, and it has been the goal for over 18 months. The quick and dirty on it is this:

  • Cross Platform Support
  • MFA for Azure Support in cmdlets
  • New cmdlets for managing DBRoles
  • Change Log of every subversion shipped to the PowerShell Gallery
  • Digitally signed so cmdlets work in restricted environments
  • Optimized for importing
  • Over 550 cmdlets available for automating Microsoft SQL Instances
    • Nearly 5x the cmdlets available in the MS SQL Server Module
  • The quality of the cmdlets has also gone up, with thousands of integration tests
  • Parameters, internal code structure, and error handling have been standardized

These are just the high points that stick out in my mind at this time. There are quite a few blogs being released today highlighting some of the other great features coming to V1.0 (Read More at https://dbatools.io/dbatools10)

If this is the first you are hearing about Dbatools, download it now (Install-Module Dbatools) and give it a spin!
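A minimal way to try it out (assuming PowerShell 5.1 or newer with access to the PowerShell Gallery) looks like this:

Install-Module dbatools -Scope CurrentUser
Import-Module dbatools
(Get-Command -Module dbatools).Count   # should report well over 550 commands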

Personal Journey to V1.0

For me personally, the journey for Dbatools reaching 1.0 has been a big part of my own professional growth.

Back in 2016 I started on a team that was understaffed and drowning in technical debt. Suddenly, I was handed the DBA responsibilities after a senior systems engineer retired. I did not know much about databases and was more of a PowerShell enthusiast. I ended up discovering Dbatools while trying to troubleshoot a network latency issue with our Data Architect, and I started to use it further when migrating from a 2012 AG to a 2016 AG cluster. (Using this module actually helped inspire me to start this blog.)

In 2017 I attended PASS Summit trying to improve my DBA skills, but found myself drawn to all of the PowerShell sessions. I happened to see Aaron Nelson give a talk on Reporting Services with PowerShell and was trying to find him at lunch to ask a question. I sat down at his table, but instead of talking with him I ended up talking with Stuart Moore, Sander Stad, and ConstantineK about how I was using dbatools, and only then found out that these guys were some of the main contributors. (I also accidentally stalked them the rest of the week.) Later that week I also met Chrissy LeMaire and Shawn Melton, who both encouraged me to contribute (thanks for doing this and being awesome!).

I did a few PRs in 2017, but in 2018 I finally started making more consistent PRs to the project. While I have never felt like a true data professional, I felt pretty confident in PowerShell, so most of my pull requests have focused on the integration tests and PSScriptAnalyzer rules. Writing tests is not glamorous, but I honestly used it as an opportunity to learn various aspects of SQL Server, since most integration tests require you to get/set/update objects on an instance. So if I am writing a test to copy a Database Mail object, I probably need to understand how to configure it using T-SQL first.

And Beyond!

Now that Dbatools is at version 1.0, the expectation is that changes will be more stable and that any major breaking change will come with advance notice. There will also be more development for cloud-native databases, continued support for cmdlets running on Linux, and whatever cmdlets the community needs.

I am personally going to continue working on integration tests to raise the overall code coverage of the cmdlets. I would also like to work on adding requested cmdlets that have been sitting in the queue for a few months, and maybe even some Linux-specific cmdlets.

Conclusion

Not enough can be said for Chrissy and the hard work she and the team have been doing in launching Dbatools into this next phase. The beauty of the SQL Collaborative is that they have been able to react quickly to the needs of the MS SQL community, and they have promised to keep doing so in the years to come.

Kerberos Constrained Delegation with Group Managed Service Accounts

Background

One of the most painful troubleshooting experiences for me has been trying to figure out how to set up SQL Server Reporting Services (SSRS) to use Kerberos Constrained Delegation. This is where you try to execute a report using data from a SQL Server instance on a different computer, and you receive the unhelpful and annoying 'NT AUTHORITY\ANONYMOUS LOGON' error whenever you try to run your report. The issue stems from the fact that the server running the reports cannot pass your authentication on to the data source.

The problem I always ran into is that I could never find the right blog or guide that got me through the entire setup without deviating from my configuration or becoming unclear. To add to the complexity, you need to edit Service Principal Names (SPNs) on Active Directory (AD) computer objects, which means you need a Domain Admin if you are not one already. After a lot of struggling, finally understanding how this works, and getting it working for SSRS, I felt I would never run into this problem again. Until today.

I have a SQL Server 2016 Always On Availability Group cluster that needs a linked server to a SQL Server 2017 instance (a different but similar problem to the SSRS example above). The one variance from the normal Kerberos setup is that the 2016 cluster is using a Group Managed Service Account (gMSA) to run the SQL Service. This is a type of domain-managed service account introduced in Windows Server 2012. Instead of being a traditional AD user object (for which you may need to manually change the password from time to time), the gMSA is managed by AD and has its password changed automatically based on the domain policy. Unlike a standard user, it does not have the Delegation tab in Active Directory Users and Computers (ADUC) once an SPN has been set for it. (If you are interested in the process of setting up a gMSA with SQL Server, I suggest this blog from Norm Eberly.)
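For reference, the rough shape of creating a gMSA with the ActiveDirectory PowerShell module is sketched below; the account and group names are made up for illustration, and the domain needs a KDS root key before this will work (see the linked blog for the full procedure).

# Create the gMSA and allow the AG nodes (via an AD group) to retrieve its password
New-ADServiceAccount -Name gMSAsqlservice -DNSHostName gMSAsqlservice.domain.local `
    -PrincipalsAllowedToRetrieveManagedPassword 'SQL-AG-Nodes'

# On each node in the cluster, install and verify the account before assigning it to the SQL Service
Install-ADServiceAccount -Identity gMSAsqlservice
Test-ADServiceAccount -Identity gMSAsqlservice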

(Note: If you are having issues with setting up Kerberos for SSRS, I highly suggest Guy in a Cube's video and write-up on this.)

Methodology

The steps here follow the same general pattern you would use for any other Kerberos Constrained Delegation setup:

  • Ensure SPNs are configured on the service account and data sources
  • Delegate Access on the service account to the Data Source
  • Test linked server (or other delegated service)

Checking SPNs

The Service Principal Name on an AD object advertises a service offered by that object on the network. When you install SQL Server on a computer and use the default service account, the HOST/ SPN is already registered on the computer account and usually handles Kerberos authentication from one AD object to another. You should check each computer and service account used in your setup. To check the SPNs of a computer account or the gMSA, use the following:

PS C:\Windows\System32> setspn -l gMSAsqlservice

For your servers this should return services like HOST/ServerName, HTTP/ServerName, WSMAN/ServerName, MSSql… If the gMSA returns nothing, you will need to add an MSSQLSvc SPN for each of the nodes in the AG cluster. Use the following (note: you will need Domain Admin access, or delegated permissions, to run this against an object):

PS C:\> setspn -a MSSQLSvc/Node1 gMSAsqlservice
PS C:\> setspn -a MSSQLSvc/Node1.domain.local gMSAsqlservice

If you are using a named instance then you can add :Name to the end. In some cases you may need to give the port number as well if there are multiple instances on one box.

PS C:\> setspn -a MSSQLSvc/Node1:Name gMSAsqlservice
PS C:\> setspn -a MSSQLSvc/Node1.domain.local:Name gMSAsqlservice
PS C:\> setspn -a MSSQLSvc/Node1.domain.local:1433 gMSAsqlservice
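If you prefer the ActiveDirectory PowerShell module over setspn, a rough equivalent (a sketch using the same illustrative names as above) is:

# List the SPNs currently registered on the gMSA
Get-ADServiceAccount -Identity gMSAsqlservice -Properties servicePrincipalName |
    Select-Object -ExpandProperty servicePrincipalName

# Add the missing MSSQLSvc SPNs for a node
Set-ADServiceAccount -Identity gMSAsqlservice -Add @{
    servicePrincipalName = 'MSSQLSvc/Node1', 'MSSQLSvc/Node1.domain.local'
}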

Configure Delegation

In a typical setup with a standard AD user object you would open ADUC and click the Delegation tab, but in the case of a gMSA no Delegation tab exists even after the SPN is set. The point of the Delegation tab is that you are marking which service (on which computer) the current service account is allowed to pass a user's credentials to. While that specific tab does not exist, if you enable Advanced Features in ADUC you can use the Attribute Editor to change the underlying attributes directly. (Note: You may also set all of this via PowerShell when you create the account with New-ADServiceAccount, or by updating it later with Set-ADServiceAccount; see the sketch after the steps below.)

First: locate and edit the msDS-AllowedToDelegateTo attribute

  • You need to make sure that you launch ADUC and have “Advanced Features” enabled.
    • This can be done by clicking "View" > "Advanced Features."
  • Navigate to the gMSA, Right Click, and select “Properties”
  • Select the “Attribute Editor” tab
  • Navigate to “msDS-AllowedToDelegateTo” Attribute
  • Click “Edit”
  • Type the SPN of the data source you want your linked server to pull data from, then click "Add" and "OK"
    • Remember that the gMSA is running the SQL Service you are setting the Linked Server up on.

Second: Set the userAccountControl

  • Still looking at the “Attribute Editor” tab on the gMSA account
  • Navigate down to the “userAccountControl” attribute
  • Click “Edit”
  • Replace the Value with 16781312
    • This is not in the official list, but this value was discovered via configuring a standard AD User Object.
    • Edit: the value comes from combining the userAccountControl values of WORKSTATION_TRUST_ACCOUNT (4096) and TRUSTED_TO_AUTH_FOR_DELEGATION (16777216)
  • Click “OK” and close all Dialogues
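As noted above, both attributes can also be set from PowerShell instead of the Attribute Editor. A sketch, where the target SPN is an assumption for illustration:

# Allow the gMSA to delegate to the data source's SQL service (msDS-AllowedToDelegateTo)
Set-ADServiceAccount -Identity gMSAsqlservice -Add @{
    'msDS-AllowedToDelegateTo' = 'MSSQLSvc/SQL2017.domain.local:1433'
}

# Set the TRUSTED_TO_AUTH_FOR_DELEGATION bit rather than hand-editing userAccountControl
Set-ADAccountControl -Identity 'gMSAsqlservice$' -TrustedToAuthForDelegation $true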

Testing the Linked Server

You may need to restart the SQL Service before creating your linked server, but after that you should be able to create it and execute a query against it. If you are still receiving errors, make sure that you have the correct SPNs on the correct service account and that you have permissions on the destination. It may also help to register both the NetBIOS name and the FQDN of the destination server as SPNs.
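One quick way to exercise the double hop is to query through the linked server from a third machine using your own Windows credentials; a sketch (the listener and linked server names are assumptions):

# Run this from a workstation, not from the AG node itself, so the credential actually has to hop
Invoke-Sqlcmd -ServerInstance 'AGListener.domain.local' -Query 'SELECT TOP (5) name FROM [SQL2017LINK].master.sys.databases;'
# If delegation is broken you will typically see the ANONYMOUS LOGON error instead of rows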

Conclusion

This go-around of figuring out Kerberos Constrained Delegation was not as difficult as my first (which lasted about six months of on-and-off troubleshooting), but it is nothing I would wish on anyone. The benefit, if you get it right, is the key: it allows you to set up a linked server (or even SSRS) that still operates within the constraints of the caller's AD credentials. This promotes least-privileged access on your SQL instance in a shared environment where not everyone should have access to the remote system.

Copy-SqlLinkedServer

Note: This is the last of my series on dbatools.io, and upon publishing it I discovered that dbatools is changing its prefix to Verb-DbaNoun. Read about it here: https://dbatools.io/bagofbobbish-beta/

In this series on the dbatools module (https://dbatools.io) we are going to look next at Copy-SqlLinkedServer.

Linked servers may not be widely used in every organization, but they were heavily used on one of my servers. They are usually a workaround for reaching some other system, such as an Oracle database, or a means of connecting to a reporting data source. These connections can be complicated because they sometimes have local credentials stored in the connection with no means of retrieving the password. By using this command in the following way, you can copy a linked server to another SQL instance with ease.

Copy-SQLLinkedServer -Source SQL2012 -Destination SQL2016 -LinkedServer OracleDB2

Any settings that you need for the linked server (RPC Out, Data Access, etc.) should be turned on before moving the linked server with this method, as it only copies the name and login (if present) and inherits all other settings from the destination instance.

 

Copy-SqlDatabaseMail

In this series on the dbatools module (https://dbatools.io) we are going to look next at Copy-SqlDatabaseMail.

If you are not in a particularly large shop (or a well-funded one) you may not have the most robust monitoring solution, but you probably at least have an email relay you can use. Setting up email alerts on a SQL instance is very important so that you receive notifications of job failures or completions of Integration Services jobs. Additionally, you may have triggered jobs which notify you when a specific database is filling up. Once you configure these settings, it can be a pain to recreate them on another instance, especially if you are not the email admin.

By using this syntax you can easily migrate the settings:

Copy-SQLDatabaseMail -Source SQL2012 -Destination SQL2016

After migrating the settings you may still need to enable the mail profile on the instance you are migrating to (right-click the SQL Server Agent > Properties > Alert System > Enable mail profile), but once you do, you will have all the settings in place for your alerts. You may also want to use Copy-SqlOperator if you have specific email addresses set up on your jobs; this is absolutely a prerequisite before you migrate those jobs from the other instance.
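Copying the operators follows the same pattern as the other commands in this series; treat the syntax below as a sketch from this version of the module:

Copy-SqlOperator -Source SQL2012 -Destination SQL2016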

 

Copy-SqlJob

In the last post on the dbatools module (https://dbatools.io) we talked about Copy-SqlCredential. Using that cmdlet may be a prerequisite to using Copy-SqlJob.

Copy-SqlJob will move SQL Agent jobs, including their schedules, from one instance to another. This can be highly efficient when moving an instance from one server to another. The only downside of this cmdlet is that it copies the job with the exact same object id, which means that using it to duplicate jobs as templates on a single instance will cause them to be linked: if you change a setting on one, it will affect the other jobs copied from the same source.

The syntax is as follows:

Copy-SqlJob -Source SQL2012 -Destination SQL2016 -jobs NightlyMaintenance,HourlyTransactionBackups

Copy-SqlCredential

In this series on the dbatools module (https://dbatools.io) we are going to look next at Copy-SqlCredential.

As you may or may not know, if DBAs are going to keep their databases in shape and continue to process ETL (Extract, Transform, and Load) jobs, they need to leverage automation. One mechanism provided by SQL Server is the Agent, which may have jobs that leverage all sorts of items (T-SQL, SSIS packages, even batch and PowerShell scripts). Most of these jobs can run as the service account running the Agent, but in some cases (batch and SSIS among others) a Credential and Proxy are needed to run them.

When migrating from one server to another, the DBA may not have access to the passwords for the Credentials that exist to run certain packages. I found Copy-SqlCredential to be helpful for this. In addition, it can be paired with Copy-SqlProxyAccount in the following way:

Copy-SQLCredential -Source SQL2012 -Destination SQL2016 -Credential 'Domain\CredentialedAccount'
 
Copy-SQLProxyAccount -Source SQL2012 -Destination SQL2016 -Credential 'Proxy_CredentialedAccount'

 

Copy-SqlLogin

As I mentioned in my last post, I am going to be spending some time going over a couple of my favorite tools from the dbatools (https://dbatools.io) PowerShell module.

The one cmdlet that actually led me to the module in the first place, and which has been a tried and true tool for me, is Copy-SqlLogin. I am relatively new to the work of a DBA, but one of the most difficult lessons I learned early on is that if you have a SQL Always On Availability Group (AG) setup, you have to maintain a copy of the logins on every replica from which you might need to read data. From what I have read, to keep these in sync you would otherwise need to either script out each login as a CREATE statement and run it on the other replica, or use partial containment to get around this problem.

This cmdlet, however, offers a method similar to the first option, but with the ability to sync all of the logins or only selected ones.

My preferred way of using this command is the following syntax:

Copy-SQLLogin -Source SQL2012 -Destination SQL2016 -Logins 'Domain\ServiceAccount','LocalSQLLogin'

This gives you the flexibility to copy the logins for the databases you are moving. After doing this from the original source, I would then switch the source and destination between the primary and each secondary replica in the cluster, as in the sketch below. One caveat I found is that if you have another AG on the secondary that is not readable, or is read-intent only, this cmdlet may fail or warn that not all the databases were available.
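In practice that ends up looking something like this (the replica names are made up for illustration):

# Repeat the copy toward each secondary replica after seeding the logins on the primary
'SQL2016-N2','SQL2016-N3' | ForEach-Object {
    Copy-SQLLogin -Source SQL2016-N1 -Destination $_ -Logins 'Domain\ServiceAccount','LocalSQLLogin'
}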

Another powerful way to use this cmdlet is with the -SyncOnly switch:

Copy-SQLLogin -Source SQL2012 -Destination SQL2016 -SyncOnly

This does not add or drop the logins, but merely syncs up their securables. For instances that are 2012 and newer, this cmdlet can also be used to move the SIDs, passwords, and other properties. It makes it a powerful addition to any DBA's tool belt.

dbatools

Recently I found myself responsible for a large database migration and knew I would need to make sure the new environment had the exact same setup as the previous systems, and that it was set up and migrated as quickly and accurately as possible. As a result, I discovered dbatools (https://dbatools.io), a PowerShell module that provides some neat cmdlets for SQL Server administration. This module can be installed on any recent version of PowerShell (v4 and v5) from the PowerShell Gallery via Find-Module and Install-Module.

The most helpful of the over 150 cmdlets, for me, are the following:

  1. Copy-SqlLogin
  2. Copy-SqlCredential
  3. Copy-SqlJob
  4. Copy-SqlDatabaseMail
  5. Copy-SqlLinkedServer
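Once the module is installed you can list the migration commands for yourself:

Get-Command -Module dbatools -Verb Copy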

In some of the following posts I will share how some of these are used.