Dbatools to V1.0 and Beyond

Good News, Everyone!

Dbatools has finally reached the critical mass needed to be released as version 1.0. This is a major step for the maintainers and developers of this module; it has been the goal for over 18 months. The quick and dirty on it is this:

  • Cross Platform Support
  • MFA for Azure Support in cmdlets
  • New cmdlets for managing DBRoles
  • Change Log of every subversion shipped to the PowerShell Gallery
  • Digitally signed so cmdlets work in restricted environments
  • Optimized for importing
  • Over 550 cmdlets available for automating Microsoft SQL Server instances
    • Nearly 5x the cmdlets available in the MS SQL Server module
  • Improved cmdlet quality, backed by thousands of integration tests
  • Standardized parameters, internal code structure, and error handling

These are just the high points that stick out in my mind at this time. There are quite a few blogs being released today highlighting some of the other great features coming to V1.0 (Read More at https://dbatools.io/dbatools10)

If this is the first you are hearing about Dbatools, download it now (Install-Module Dbatools) and give it a spin!

Personal Journey to V1.0

For me personally, the journey for Dbatools reaching 1.0 has been a big part of my own professional growth.

Back in 2016 I started on a team that was understaffed and drowning in technical debt. Suddenly, I was handed the DBA responsibilities after a senior systems engineer retired. I did not know much about databases and was more of a PowerShell enthusiast. I discovered Dbatools while troubleshooting a network latency issue with our Data Architect, then started using it further when migrating from a 2012 AG to a 2016 AG cluster. (Using this module actually helped inspire me to start this blog.)

In 2017 I attended PASS Summit trying to improve my DBA skills, but found myself drawn to all of the PowerShell sessions. I happened to see Aaron Nelson give a talk on Reporting Services with PowerShell and tried to find him at lunch to ask a question. I sat down at his table, but instead of talking with him I ended up talking with Stuart Moore, Sander Stad, and ConstantineK about how I was using dbatools, and just happened to find out that these guys were some of the main contributors. (I also accidentally stalked them the rest of the week.) Later that week I also met Chrissy LeMaire and Shawn Melton, who both encouraged me to contribute (thanks for doing this and being awesome!).

I did a few PRs in 2017, but in 2018 I started submitting more consistent PRs to the project. While I have never felt like a true data professional, I felt pretty confident in PowerShell, so most of my pull requests have focused on the integration tests and PSScriptAnalyzer rules. While writing tests is not glamorous, I honestly used it as an opportunity to learn various aspects of SQL Server, since most integration tests require you to get/set/update objects on an instance. So if I am writing a test to copy a Database Mail object, I probably need to understand how to configure it using T-SQL first.

And Beyond!

Now that Dbatools is at version 1.0, the expectation is that changes will be more stable and that any major breaking change will come with notice. There will also be more development for cloud-native databases, continued support for cmdlets running on Linux, and whatever cmdlets the community needs.

I am personally going to continue working on integration tests to raise the overall code coverage of the cmdlets. I would also like to work on adding requested cmdlets that have been sitting in the queue for a few months, and maybe even some Linux specific cmdlets.


Not enough can be said for Chrissy and the hard work she and the team have put into launching Dbatools into this next phase. The beauty of the SQL Collaborative is that they have been able to react quickly to the needs of the MS SQL community, and they promise to keep doing so in the years to come.

Checking Your Backup Strategy with DBAChecks

Within the last 18 months I have become an accidental DBA and found myself grasping for information and tools to help manage the handful of database servers for which I am responsible. In that journey I discovered dbatools (https://dbatools.io), a great community-built PowerShell module for configuring SQL instances. If you’re not familiar with it, the module includes cmdlets such as Copy-DbaLogin, Find-DbaBackup, Get-DbaMemoryUsage, and Test-DbaPowerPlan. That last cmdlet really encapsulates the heart of the dbatools team: they are seeking to provide tools that reflect best practices for data professionals. (If you don’t know that an improperly set Power Plan can cause performance issues, you should probably download dbatools and give it a shot.)

Continuing this desire to help the community practice good habits, the dbatools team has now released dbachecks (https://dbachecks.io). This module leverages Pester (a PowerShell module for unit testing) to run tests against your SQL instances. You can find full details on what is required and how to install the module on their GitHub page.
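For reference, installing it from the PowerShell Gallery is a one-liner (a minimal sketch; the GitHub page covers prerequisites and offline install options):

Install-Module dbachecks -Scope CurrentUser
Import-Module dbachecks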

There are a large number of tests included with the dbachecks module, but you can create your own if you are familiar with writing tests in Pester. A few of the built-in tests that I found most helpful (and that actually made me re-evaluate my production systems) were the tests for backups.

After loading the module you can check out all of the predefined tests by running Get-DbcCheck. In order to get a list of all of the tests that check backups, use the -Pattern parameter to search for tests with matching keywords.

Get-DbcCheck -Pattern backup

To run the tests, execute Invoke-DbcCheck with the following syntax, which returns both passing and failing tests.

Invoke-DbcCheck -Check BackupPathAccess,DefaultBackupCompression -SqlInstance localhost -ComputerName localhost

As you can see, the backup location is accessible, but I have not configured compression for my backup sets. It also took less than half a second to run these checks. Now, one of the most commonly unchecked aspects of a database backup is whether it will actually restore, and conveniently enough dbachecks has a test for that as well. Under the hood, this test uses Test-DbaLastBackup to try restoring the database with the -VerifyOnly flag:

Invoke-DbcCheck -Check TestLastBackupVerifyOnly -SqlInstance localhost -ComputerName localhost

The user databases for my SSRS instances were successfully tested, but there were no backups for my system databases. As a result I double-checked; everything was in order, but you can never be too sure.
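If you want to run the underlying check yourself with dbatools, a minimal sketch (verify the parameters against your installed version) looks like this:

Test-DbaLastBackup -SqlInstance localhost -VerifyOnly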

So my suggestion is to run this as a scheduled task or agent job and make sure that your backups are up to date. This module makes it easy to ensure that you don’t lose your job because of data loss. This is just a highlight of a few of these tests; imagine the rest of the checks and tests that are possible with this module, whether built in or of your own creation. Download the module from the PowerShell Gallery and give it a try.
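One way to schedule it, sketched here with the built-in ScheduledTasks cmdlets (the script path is a hypothetical placeholder; save your Invoke-DbcCheck call to that file first):

# Run a saved script containing the Invoke-DbcCheck call every morning at 6 AM
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-File C:\Scripts\BackupChecks.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 6am
Register-ScheduledTask -TaskName 'dbachecks-Backups' -Action $action -Trigger $trigger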




Note: This is the last of my series on dbatools.io, and upon publishing it I discovered that dbatools is changing their prefix to Verb-DbaNoun. Read about it here: https://dbatools.io/bagofbobbish-beta/

In this series on the dbatools module (https://dbatools.io) we are going to look next at Copy-SqlLinkedServer.

Something that may not be widely used by some organizations, but was heavily used on one of my servers, is linked servers. These are usually a workaround for reaching some other Oracle DB, or even a means of connecting to a reporting data source. These connections can be complicated, as they sometimes have local credentials stored in the connection with no means of retrieving the password. By using this command in the following way you can copy the linked server to another SQL instance with ease.

Copy-SqlLinkedServer -Source SQL2012 -Destination SQL2016 -LinkedServer OracleDB2

Any settings that you need for the linked server (RPC Out, Data Access, etc.) should be turned on before moving the linked server with this method, as it only copies the name and login (if present) and inherits all other settings from the destination instance.



In this series on the dbatools module (https://dbatools.io) we are going to look next at Copy-SqlDatabaseMail.

If you are not in a particularly large shop (or a well-funded one), you may not have the most robust monitoring solution, but you may at least have an open email relay for your use. Setting up email alerts on a SQL instance is very important so that you can receive notifications of job failures or completions of Integration Services jobs. Additionally, you may have triggered jobs which notify you when a specific database is filling up. Once you configure these settings, it may be a pain to recreate them on another instance, especially if you are not the email admin.

By using this syntax you can easily migrate the settings:

Copy-SqlDatabaseMail -Source SQL2012 -Destination SQL2016

After migrating the settings you may still need to enable the mail services on the instance you are migrating to (right-click the Agent > Alert System > Enable Mail Profile), but once you do, you will have all the settings in place for your alerts. You may also want to use Copy-SqlOperator if you have specific email addresses set up on your jobs. This is absolutely a prerequisite before you migrate specific jobs from another instance.
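A sketch of copying the operators as well, using the same source and destination as above (check Get-Help Copy-SqlOperator for the parameters your version supports):

Copy-SqlOperator -Source SQL2012 -Destination SQL2016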



In the last post on the dbatools module (https://dbatools.io) we talked about Copy-SqlCredential. Running that cmdlet may be a prerequisite to using Copy-SqlJob.

Copy-SqlJob will move SQL Agent jobs, including their schedules, from one instance to another. This can be highly efficient when moving an instance from one server to another. The only downside of this cmdlet is that it copies the job with the exact same object ID, which means that using it to duplicate jobs as templates will cause them to be linked on a single instance: if you change a setting on one, it will affect the other jobs copied from the same instance.

The syntax is as follows:

Copy-SqlJob -Source SQL2012 -Destination SQL2016 -Jobs NightlyMaintenance,HourlyTransactionBackups


In this series on the dbatools module (https://dbatools.io) we are going to look next at Copy-SqlCredential.

As you may or may not know, if DBAs are going to keep their databases in shape and continue to process ETL (Extract, Transform, and Load) jobs, they need to leverage automation. One facility SQL Server provides is the Agent, which may have jobs that leverage all sorts of items (T-SQL, SSIS packages, even batch and PowerShell scripts). Most of these jobs can run as the service account running the Agent, but in some cases (batch and SSIS among others) a Credential and Proxy are needed to run them.

When migrating from one server to another, the DBA may not have access to the passwords for the Credentials that exist to run certain packages. I found Copy-SqlCredential to be helpful for this. In addition, it can be paired with Copy-SqlProxyAccount in the following way.

Copy-SqlCredential -Source SQL2012 -Destination SQL2016 -Credential 'Domain\CredentialedAccount'
Copy-SqlProxyAccount -Source SQL2012 -Destination SQL2016 -Credential 'Proxy_CredentialedAccount'



As I mentioned in my last post, I am going to spend some time going over a couple of my favorite tools from the dbatools (https://dbatools.io) PowerShell module.

The one cmdlet that actually led me to the module in the first place, and which has been a tried-and-true tool for me, is Copy-SqlLogin. I am relatively new to the work of a DBA, but one of the most difficult lessons I learned early on is that if you have a SQL Always On Availability Group (AG) set up, you have to maintain a copy of the Logins on every replica from which you may need to read data. From what I have read, to keep these in sync you would need to either script each Login out and create it on the other replica, or use partial containment to get around the problem.

This cmdlet, however, offers a method similar to the first one, but with the option of syncing all of the Logins or only select Logins.

My preferred method of using this command is the following syntax:

  Copy-SqlLogin -Source SQL2012 -Destination SQL2016 -Logins 'Domain\ServiceAccount','LocalSQLLogin'

This provides you the flexibility to copy the Logins for the databases that you may be moving. After doing this from the original source, I would then switch the source and destination for the primary and any secondary replica in the cluster. One caveat I found is that if you have another AG on the secondary that is not a readable secondary, or is read-intent only, this cmdlet may fail or warn that not all the databases were available.

Another robust way to use this cmdlet is with the -SyncOnly switch:

  Copy-SqlLogin -Source SQL2012 -Destination SQL2016 -SyncOnly

This does not add or drop the user, but merely syncs up the login’s securables. For instances that are 2012 and newer, this cmdlet can be used to move the SIDs, passwords, and other items. It makes a powerful addition to any DBA’s tool belt.


Recently I found myself responsible for a large database migration and knew I would need to make sure the new environment had the exact same setup as the previous systems, and was set up and migrated as quickly and accurately as possible. As a result I discovered dbatools (https://dbatools.io), a PowerShell module that provides some neat cmdlets for SQL administration. The module can be installed on any recent version of PowerShell (v4 and v5) from the PowerShell Gallery via Find-Module and Install-Module.
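For example, from a PowerShell prompt (a sketch using the current-user scope, which avoids needing an elevated session):

Find-Module dbatools
Install-Module dbatools -Scope CurrentUser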

The most helpful of the over 150 cmdlets are the following:

  1. Copy-SqlLogin
  2. Copy-SqlCredential
  3. Copy-SqlJob
  4. Copy-SqlDatabaseMail
  5. Copy-SqlLinkedServer

In some of the following posts I will share how some of these are used.