Dbatools to V1.0 and Beyond

Good News, Everyone!

Dbatools has finally reached the critical mass needed to be released as version 1.0. This is a major step for the maintainers and developers of this module, and it has been the goal for over 18 months. The quick and dirty on it is this:

  • Cross Platform Support
  • MFA for Azure Support in cmdlets
  • New cmdlets for managing DBRoles
  • Change Log of every subversion shipped to the PowerShell Gallery
  • Digitally signed so cmdlets work in restricted environments
  • Optimized for importing
  • Over 550 cmdlets available for automating Microsoft SQL Instances
    • Nearly 5x the cmdlets available in the MS SQL Server Module 
  • The quality of the cmdlets has also gone up, backed by thousands of integration tests
  • Parameters, internal code structure, and error handling have been standardized

These are just the high points that stick out in my mind at this time. Quite a few blogs are being released today highlighting some of the other great features coming in v1.0 (read more at https://dbatools.io/dbatools10).

If this is the first you are hearing about dbatools, download it now (Install-Module dbatools) and give it a spin!
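A minimal way to take it for a spin (a sketch; Test-DbaPowerPlan is one of the module's cmdlets, and localhost is a placeholder for your own server):

```powershell
# Install from the PowerShell Gallery for the current user and import it
Install-Module dbatools -Scope CurrentUser
Import-Module dbatools

# See how many cmdlets ship in the module
(Get-Command -Module dbatools).Count

# Example best-practice check (replace localhost with one of your servers)
Test-DbaPowerPlan -ComputerName localhost
```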

Personal Journey to V1.0

For me personally, the journey for Dbatools reaching 1.0 has been a big part of my own professional growth.

Back in 2016 I started on a team that was understaffed and drowning in technical debt. Suddenly, I was handed the DBA responsibilities after a senior systems engineer retired. I did not know much about databases and was more of a PowerShell enthusiast. I discovered dbatools while trying to troubleshoot a network latency issue with our Data Architect, and started using it further when migrating from a 2012 AG to a 2016 AG cluster. (Using this module actually helped inspire me to start this blog.)

In 2017 I attended PASS Summit trying to up my DBA skills, but found myself drawn to all of the PowerShell sessions. I just happened to see Aaron Nelson give a talk on Reporting Services with PowerShell and was trying to find him at lunch to ask a question. I sat down at his table, but instead of talking with him I ended up talking with Stuart Moore, Sander Stad, and ConstantineK about how I was using dbatools, and happened to find out that these guys were some of the main contributors. (I also accidentally stalked them the rest of the week.) Later that week I also met Chrissy LeMaire and Shawn Melton, who both encouraged me to contribute (thanks for doing this and being awesome!).

I did a few PRs in 2017, but in 2018 I finally started making more consistent PRs to the project. While I have never felt like a true data professional, I felt pretty confident in PowerShell, so most of my pull requests have focused on the integration tests and PSScriptAnalyzer rules. While writing tests is not glamorous, I honestly used it as an opportunity to learn various aspects of SQL Server, since most integration tests require you to get/set/update objects on an instance. So if I am writing a test to copy a Database Mail object, I probably need to understand how to configure it using T-SQL first.

And Beyond!

Now that dbatools is at version 1.0, the expectation is that changes will be more stable and that any major breaking change will come with notice. There will also be more development for cloud-native databases, continued support for cmdlets running on Linux, and whatever cmdlets the community needs.

I am personally going to continue working on integration tests to raise the overall code coverage of the cmdlets. I would also like to work on adding requested cmdlets that have been sitting in the queue for a few months, and maybe even some Linux specific cmdlets.


Not enough can be said of Chrissy and the hard work she and the team have been doing in launching dbatools into this next phase. The beauty of the SQL Collaborative is that they have been able to react quickly to the needs of the MS SQL community, and they promise to keep doing so in the years to come.

What not to do when Connecting Azure DevOps to AzureAD


Visual Studio Team Services…. ummm I mean Azure DevOps (…long live VSTS) is a really amazing offering from Microsoft. It is a great place for developers and operations people to work together, store code, run CI/CD pipelines, and keep track of projects. It is also free for the first 5 basic users (more if you have an MSDN Enterprise/Professional license), hosts private repositories, allows pipelines in Azure, etc. I started using AzDO about a year ago when I was tired of maintaining my own GitLab environment and our lead dev was tired of TFS. We had also both suffered from a lack of documentation and consistent code tracking, so we agreed it would be better to share a version control system and not repeat the missteps of our predecessors.

If you haven’t used Azure DevOps I suggest you check it out.

When we started using DevOps, we had not fully migrated to Azure AD as an organization. At the time, the only way we could use our MSDN subscriptions or DevOps was with a Microsoft account that we had personally set up. Some of us had opted for email accounts we already had (<username>@outlook.com) and others for accounts that mimicked organization ids (<username>@contoso.com). This worked fine, but as we started on-boarding more devs, we thought it might be best to pivot to using our Azure AD accounts. This decision was meant to make the user life-cycle a little easier to manage (…"oh hey, X still has access to our code."), plus we wanted to integrate notifications into some of our MS Teams channels.

So this week we undertook the change of connecting our Azure DevOps organization to Azure AD, and it did not go so well. Don't do what we did…


Connect Azure DevOps to the AzureAD Tenant used by my organization

  • Setup a Microsoft account in your DevOps Organization
  • Connect the Tenant
  • Test Logins

Setup a Microsoft Account

Before I started down this path of connecting these two services, I briefly read Microsoft's documentation on the process (Connecting & Disconnecting). I would urge you to read both closely before making this change. You will also need to alert your users to the change and possibly switch their MSDN licenses to use their AzureAD accounts.

The basic outline for the Azure DevOps change as I understood it was:

  1. Get a Microsoft Account (MSA)
  2. Make it a Project Collection Administrator*
  3. Make sure it is not the same name as an existing AzureAD identity

This is going to be easy…

As I said earlier, we were already using MSA accounts in our Azure DevOps organization, and my account already had the rights (steps 1 and 2 are done!). The one part I was concerned about was that I had named my MSA the same as my friendly email alias in our AzureAD tenant; however, this did not turn out to be an issue, since my UserPrincipalName is different from my alias, so I could add both to the DevOps organization without any problems.

* Looking back, I should have created a new MSA account specifically for this change. It would have made me pay more attention to the article. The MSA account I used already had the proper permissions in Azure DevOps, but it was not a guest in our Azure AD tenant, which was part of step 2. Interestingly, if you use <username>@contoso.com for your MSA, it doesn't allow you to add that user to your AzureAD as a guest, so you would need to use some other external domain for this MSA.

By default every MSA creates its own AzureAD directory (usually something like <usernameoutlook>.onmicrosoft.com), and once you add the account to your directory it receives that as an additional directory it can use. The one I was using had its own default directory, but it also had Microsoft's directory. I believe this happened because of either an interactive lab I did back at Ignite or a lesson I did on Microsoft.com/learn for Azure. (They allow you to spin up resources to provide interactive learning.)

To figure out which directories your account has:

  • Login to portal.azure.com
  • Click on your name in the top right corner
  • Click "Switch Directory"

Connect the Tenant

We began our move earlier in the week by switching all of our MSDN users to their AzureAD accounts. This worked for most users, and they were able to login to MSDN using their AzureAD account (although it took almost a day for a few users) and to DevOps using both their MSA and AzureAD accounts. They could use both in DevOps because we had added the new accounts to DevOps in preparation for the change, to make sure they got the same permissions.

The last user having trouble kept getting a 401 for their AzureAD account, with an error saying the account was not found in the directory. We tried a little bit of this and that, but ultimately we decided, let's just flip it on.

If we connect it to our Directory then he will be able to login. What is the worst that could happen…

I logged in with my MSA to Portal.Azure.com, navigated to the Azure DevOps Organization blade, and clicked "Connect AAD." It asked me one more time if I was sure I wanted to do this, and away we went.

It took a few minutes, but then it told me that it had connected the Service to Directory 72f988bf-86f1-41af-91ab-2d7cd011db47….

Wait that isn’t our directory. I connected it to the wrong one…

Wrong AzureAD Directory

Test Logins

By this point in the day, after the feeling of despair set in, things weren't looking so awesome. We immediately found out that the one person we were trying to fix could not login (because it was the wrong tenant). Neither could anyone else in our DevOps organization. Neither their MSA nor their AzureAD accounts worked, except for the account with which I had bound everything. (We later found out that all our existing PAT and SSH keys worked even though we could not login. This ended up being a saving grace, as our development team was able to pull down the critical repos they were working on.)

We had read the disconnect instructions and tried to reverse the change, but in order to do this you need to have a Microsoft account as the owner, and it needs to be a Global Admin. Fun fact: I am not (and may never be at this point) a Global Admin in Microsoft's tenant. So we sent a support message to Microsoft. It took a few days of tense waiting (mostly because of our support tier), but we eventually had them switch the directory identity to our organization. This resolved everyone's login issues and got us back to the desired state.

“aww snapdizzle. I’m in. What is this magical world!”

– An Unnamed Dev after 2 days without DevOps


Read the manual. Closely. I usually read a lot and plan a lot before making a change like this, but missing one important detail (the MSA must be a guest of the joining tenant) caused days of chaos for our devs. Make a new MSA when joining to your organization; you can always throw it away after the switch is made. Also make sure you're paying for the right level of support. There were several times where we almost switched to Premier (and we may still do this in the future).

Kerberos Constrained Delegation with Group Managed Service Accounts


One of the most painful troubleshooting experiences for me has been trying to figure out how to set up SQL Server Reporting Services (SSRS) to use Kerberos Constrained Delegation. This is where you try to execute a report using data from a SQL Server instance on a different computer and receive the unhelpful and annoying 'NT AUTHORITY\ANONYMOUS LOGON' error whenever you run your report. The issue stems from the fact that the server running reports cannot pass your authentication to the data source.

The problem I always ran into with this issue is that I could never find the right blog or guide that got through the entire setup without deviating from my config or becoming unclear. Then, to add to the complexity, you need to edit Service Principal Names (SPNs) on Active Directory (AD) computer objects, which means you need a Domain Admin if you are not one already. After a lot of struggling, understanding how this works, and finally getting it to work on SSRS, I felt I would never run into this problem again. Until today.

I have a SQL 2016 Always On Availability Group cluster that needs a linked server to a SQL 2017 server (a different but similar problem to the SSRS example above). The one variance from the normal Kerberos setup is that the 2016 cluster is using a Group Managed Service Account (gMSA) to run the SQL Service. This is a type of domain-controlled service account introduced in Windows Server 2012. Instead of being a traditional AD user object (for which you may need to manually change the password from time to time), the gMSA is managed by AD and has its password changed automatically based on the domain policy. So unlike a standard user, it does not have the delegation tab in Active Directory Users and Computers (ADUC) once an SPN has been set for it. (If you are interested in the process of setting up a gMSA with SQL, I suggest this blog from Norm Eberly.)

(Note: If you are having issues with setting up Kerberos for SSRS, I highly suggest Guy in a Cube's video and write-up on this.)


The steps here follow most of the same guiding steps you would need for any other Kerberos Constrained Delegation setup:

  • Ensure SPNs are configured on service account and Data Sources
  • Delegate Access on the service account to the Data Source
  • Test linked server (or other delegated service)

Checking SPNs

The Service Principal Name on an AD object advertises a service for a specific object on the network. When you install SQL on a computer and use the default service account, the HOST/ SPN is already set up and usually handles Kerberos authentication from one AD object to another. You should check each computer and service account used in your setup. To check the SPNs of a computer account or the gMSA, use the following:

PS C:\Windows\System32> setspn -l gMSAsqlservice

For your servers this should return services like HOST/ServerName, HTTP/ServerName, WSMAN/ServerName, MSSQLSvc/… If the gMSA returns nothing, you will need to add the MSSQLSvc SPN for each of the nodes in the AG cluster. Use the following (note: you will need Domain Admin access or delegated permissions to execute this against an object):

PS C:\> setspn -a MSSQLSvc/Node1 gMSAsqlservice
PS C:\> setspn -a MSSQLSvc/Node1.domain.local gMSAsqlservice

If you are using a named instance then you can add :Name to the end. In some instances you may need to give the port number as well if there are multiple instances on one box.

PS C:\> setspn -a MSSQLSvc/Node1:Name gMSAsqlservice
PS C:\> setspn -a MSSQLSvc/Node1.domain.local:Name gMSAsqlservice
PS C:\> setspn -a MSSQLSvc/Node1.domain.local:1433 gMSAsqlservice

Configure Delegation

In a typical setup with a standard AD user object you could open ADUC and click the Delegation tab, but in the case of a gMSA no Delegation tab exists even after this step. The key function of the Delegation tab is marking which service (on which computer) the current service account is allowed to pass a user's credentials to. While the tab itself does not exist, if you enable the advanced options in ADUC you can view the underlying attributes. (Note: You may also set all of these settings via PowerShell when you create the account with New-ADServiceAccount, or by updating the account with Set-ADServiceAccount.)

First: locate the msDS-AllowedToDelegateTo

  • You need to make sure that you launch ADUC and have “Advanced Features” enabled.
    • This can be done by clicking the “view” > “Advanced Features.”
  • Navigate to the gMSA, Right Click, and select “Properties”
  • Select the “Attribute Editor” tab
  • Navigate to “msDS-AllowedToDelegateTo” Attribute
  • Click “Edit”
  • Type the Service Name of the Data Source you want your Linked Server to pull data from and click “Add” and “OK”
    • Remember that the gMSA is running the SQL Service you are setting the Linked Server up on.

Second: Set the userAccountControl

  • Still looking at the “Attribute Editor” tab on the gMSA account
  • Navigate down to the “userAccountControl” attribute
  • Click “Edit”
  • Replace the Value with 16781312
    • This is not in the official list, but this value was discovered via configuring a standard AD User Object.
    • Edit: the value comes from combining the userAccountControl values of WORKSTATION_TRUST_ACCOUNT (4096) and TRUSTED_TO_AUTH_FOR_DELEGATION (16777216)
  • Click “OK” and close all Dialogues
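Both attribute edits above can also be scripted with the RSAT ActiveDirectory module, as the note earlier mentions. A minimal sketch (the gMSA and SPN names are placeholders from the examples above; verify the attribute values in your environment before running):

```powershell
Import-Module ActiveDirectory

# Add the data source's SQL service to msDS-AllowedToDelegateTo on the gMSA
Set-ADServiceAccount -Identity gMSAsqlservice `
    -Add @{'msDS-AllowedToDelegateTo' = 'MSSQLSvc/DataSource.domain.local'}

# Set the TRUSTED_TO_AUTH_FOR_DELEGATION bit (the 16777216 part of 16781312)
Set-ADAccountControl -Identity gMSAsqlservice -TrustedToAuthForDelegation $true

# Verify both attributes took effect
Get-ADServiceAccount -Identity gMSAsqlservice `
    -Properties msDS-AllowedToDelegateTo, userAccountControl
```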

Testing the Linked Server

You may need to restart the SQL Service before creating your linked server, but after that you should be able to create it and execute a query against it. If you are still receiving errors, make sure that you have the correct SPNs on the correct service account and that you have permissions on the destination. It may also help to have both the NetBIOS name and the FQDN of the destination server registered as SPNs.
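One way to smoke-test the double hop end to end (a sketch; Node1 and the linked server name are placeholders, and Invoke-Sqlcmd assumes the SqlServer module is installed on your workstation):

```powershell
# Run from a workstation with your domain login so the hop actually occurs:
# workstation -> AG node -> linked server
Invoke-Sqlcmd -ServerInstance 'Node1' -Query @"
SELECT TOP (1) name
FROM [Sql2017Server].master.sys.databases;
"@
```

If this returns a row instead of the 'NT AUTHORITY\ANONYMOUS LOGON' error, the constrained delegation is working.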


This go-around of figuring out Kerberos Constrained Delegation was not as difficult as my first (which lasted about 6 months of on-and-off troubleshooting), but it is nothing I would wish on anyone. The benefit, if you get it right, is key: it allows you to set up a linked server (or even SSRS) within the constraints of AD credentials. This promotes least-privilege access on your SQL instance, where you may have a shared environment and not everyone should have access to the remote system.

Creating Word Documents with PowerShell


For the last few months I have been working on a course called “Introduction to Windows PowerShell” for Pragmatic Works. This has been a challenging and rewarding task as I have found that you learn something best when you teach it to someone else.

Part of the deliverables for this course was providing labs for students to walk through on their own, and for each lab a solution I provide in case they get stuck. On the first pass I had placed all of these labs and solutions in separate .ps1 files, but after further review it was going to be better if they were all in a single Word document. This, however, meant combining roughly 20 files into 2 documents.

Being somewhat crunched for time, I decided to see if I could automate feeding these .ps1 files into a Word Document using PowerShell.


My gut response for the last few years has been to automate as many things as possible using PowerShell. If this could not be done entirely with PowerShell, at least I would try to do as much as possible.

In order to do this I had to learn a little bit:

  • Are there any CLI options for Word?
  • If not, are there any .NET Classes exposed for Word?
  • And if either of the first two were true, could I produce a new script?

.NET Classes for Microsoft Word

Upon doing some digging I found a couple of forum posts on manipulating files using PowerShell (StackOverflow, MSDN). What caught my eye in these posts is that they both used 'Word.Application' as a COM object. Looking into this led me to information about Word's VBA interface for scripting (MSDN – VBA). The VBA scripting wasn't exactly what I was looking for, but I figured I would keep digging. Finally I found the gold at the end of the rainbow: the entire .NET namespace documentation for Office 2013+ (MSDN – Namespace). There is a lot that this documentation covers, but I will just walk through the basics of creating a file, formatting the text, adding text from other files, and finally saving the document.

Creating a File

In order to create a file you must have an instance of Word in your PowerShell session. To do this, use the New-Object cmdlet with the -ComObject parameter as shown below:

PS C:\WINDOWS\system32> $WordApplication = New-Object -ComObject "word.application"

This starts an instance of Word, which you can see in your processes, but it does not open a visible window. If you want to see the window you can do the following:

PS C:\WINDOWS\system32> $WordApplication.Visible = $true

If you look at Word, it is now visible, but no document is open. We must now create the document we want to use. We could also open up a document if we had an existing one.


Off of the $WordApplication variable we can create a new document. It is best to save this new Document into another variable for future use.

 PS C:\WINDOWS\system32> $Document = $WordApplication.Documents.Add()

At this point you now have a document to work with.

Writing the Document

By this point I had spent a lot of time trying and failing. Since there was a variable for both the application ($WordApplication) and the document ($Document), it was difficult to determine which one I needed to write from my sources. I first assumed it was the document variable, because that is what I was editing, but all of the interfaces were through the 'Selection' interface on the application variable.

PS C:\WINDOWS\system32> $Writer = $WordApplication.Selection

There are several things I wanted to do inside of this document. First I wanted to bring content in from various files, but I also wanted to create a heading for each file as it was written. The first part seemed easy, because there is a method called .TypeText(). I found that you can insert text by writing a string or by calling another cmdlet like Get-Content inside of this method. I also found that you could set the style of the document before you 'typed' and then change it back to the normal format. The most confusing part was the inconsistency in the interfaces: when calling $Writer.TypeText((Get-Content -Path $blah)) you put parentheses around the value, but when changing the style you assign a value like $Writer.Style = 'Heading 1'.

In the end, I wrote a foreach loop in the following way to import all of the text while setting the style on import.

 PS C:\WINDOWS\system32> 1..10 | foreach -Process {
>> $labpath = "C:\PragmaticWorks\Introduction to Windows PowerShell\Module$_\Lab-Solution.ps1"
>> $Writer.Style = 'Heading 1'
>> $Writer.TypeText("Module $_ Lab Solution")
>> $Writer.TypeParagraph()
>> $Writer.Style = 'Normal'
>> $Writer.TypeText((Get-Content $labpath -Raw))
>> $Writer.TypeParagraph()
>> }

note: .TypeParagraph() inserts a new line so that I would have a break between text.

When I finished inserting all of my files I saved the file and closed the document.

PS C:\WINDOWS\system32> $Document.SaveAs($filepath)
PS C:\WINDOWS\system32> $Document.Close()
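One cleanup step worth adding (a sketch, not part of the original script): quitting the Word instance and releasing the COM object, so WINWORD.EXE does not linger in your processes after the script finishes.

```powershell
# Quit the hidden Word instance created earlier and release the COM reference
$WordApplication.Quit()
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($WordApplication) | Out-Null
[GC]::Collect()
```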


This was a fun conceptual project to see if it was possible, and it did end up saving me the pain of copying and pasting all 20 documents into 2 separate documents. I did have some issues with formatting, which I would have loved to explore further, but I am not sure I could have automated that formatting since I had code examples in my source documents; I ended up manually aligning these code examples once in Word. I did notice a .Find() interface on the application variable, and it would be neat to explore it further for gathering information from Word documents on distributed systems.

An MVP I follow suggested that I make this into a module. What are your thoughts? Would you like to see a PowerShell module for interfacing with documents? Feel free to tweet me @joshCorr for more info or if you benefited from this blog.

Backing Up Facebook Photos using PowerShell


I have been using social media since 2006, when I first signed up for Myspace. That was the day and age when my parents had dial-up, and going on to check social media (or chat on AIM) was a 15-minute treat once a week (somewhat joking). Myspace was cool. You could edit your page. It was the ultimate self-expression for an IT nerd wannabe (I had some pretty cool HTML and CSS on that page). After a few years Myspace was eh, but Facebook was the new hotness. No ads, clean, uniform, mature, verified users. I decided to get an account in the spring of 2008 and have used it up until recently. Within the last 2 years I started withdrawing from it for various reasons (too many to get into here). The only thing that kept me from deleting it long ago was the budding relationship with my wife, which was thoroughly documented and photographed on Facebook.

So I decided to see if there was a way to backup the photos and albums in my Facebook account programmatically.


I could not find any apps that looked legit for downloading Facebook albums. I ran across a Stack Overflow post that described downloading photos from a Facebook page using their Graph API (Get All Photos of a Page Using Facebook Api), which got me thinking. I am not a programmer, so writing a JavaScript or C# app didn't seem viable for accomplishing this task. However, I have a fair amount of experience with PowerShell, which has Invoke-WebRequest and Invoke-RestMethod, and I figured I could use these cmdlets to gather the info and download the pictures.

In order to do this I had to learn a little bit.

Getting an Access Token

In order to connect to graph.facebook.com I first needed an Access Token to authorize my connection. You can generate this using code, but the easiest way I found to do this for a oAuth Noob, was in the following way.

  1. Login to Facebook and navigate to https://developers.facebook.com
  2. Click on “Tools & Support” in the navigation menu
  3. Click on “Graph API Explorer” under popular tools
  4. Click “Get Token”
  5. Click “Get User Access Token” on drop down
  6. Select the "user_photos" permission and click "Get Access Token" (I selected "user_posts" and "publish_actions" as well because I wanted to check posts and potentially update the photos)
  7. A facebook app authorization page will pop up. Accept this.
  8. Now you should have a valid Access Token above the API navigation bar
  9. Copy and Paste this Access Token into a PowerShell variable named $AccessToken

note: you can check the info on the token by clicking the blue i next to the Access Token to see when it expires. Mine usually lasted an hour and a half to two hours.

Finding my Photos

Now that I have an AccessToken I can use PowerShell to grab a list of my photos or my albums using some of the following code.

To gather all* photos I am tagged in:

Invoke-RestMethod -Method Get -Uri https://graph.facebook.com/v2.12/me/photos -Headers @{Authorization = "Bearer $AccessToken"}

To gather albums I have created and uploaded:

Invoke-RestMethod -Method Get -Uri https://graph.facebook.com/v2.12/me/albums -Headers @{Authorization = "Bearer $AccessToken"}

note: Be sure to include  'Bearer' in your Authorization header as you token will not work without this.

This execution returns a PSCustomObject property called data, which is a collection of objects, one per photo. Each item can be expanded to show the id, name, and other fields. If you want to include fields not in the default view, you can add them to the end of your Uri with the following syntax:

Invoke-RestMethod -Method Get -Uri "https://graph.facebook.com/v2.12/me/photos?fields=id,name,images,link" -Headers @{Authorization = "Bearer $AccessToken"}

And if you want to return more than the default 25 add something like: &limit=100

Invoke-RestMethod -Method Get -Uri "https://graph.facebook.com/v2.12/me/photos?fields=id,name,images,link&limit=100" -Headers @{Authorization = "Bearer $AccessToken"}

For full reference on the methods, fields, and edges on this api please reference Graph Api Photo Reference and Graph Api Album Reference
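The API also pages its results beyond whatever limit you set; a hedged sketch of walking the paging.next links (assuming the v2.12 response shape documented in the references above):

```powershell
# Collect every page of tagged photos by following paging.next until it runs out
$uri = "https://graph.facebook.com/v2.12/me/photos?fields=id,images&limit=100"
$allPhotos = @()
while ($uri) {
    $page = Invoke-RestMethod -Method Get -Uri $uri -Headers @{Authorization = "Bearer $AccessToken"}
    $allPhotos += $page.data
    $uri = $page.paging.next   # $null when there are no more pages, ending the loop
}
$allPhotos.Count
```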

Downloading my Photos

At this point I can put the output of Invoke-RestMethod into a variable called $Pictures, from which I will extract the 'data' field into another variable called $Images. (This is a somewhat redundant step, but I did it for ease later on in a loop.)

$Images = $($Pictures.Data)

Initially I thought the link field was where the photos were stored, but in fact it was only the link to the post of the picture. Also important to know: the 'images' field does not contain a single reference to the image, but usually 7-9 different sizes of the same image (thankfully sorted from largest to smallest).

From here we can use Invoke-WebRequest in a loop to download the images. Each image has its own source location, which is the URL to the original file. The photos can be saved by redirecting this source URL to an outfile named after the image's id.

  foreach ($p in $Images){Invoke-WebRequest -Uri $($p.images[0].source) -OutFile "D:\TaggedPhotos\$($p.id).jpg" -Headers @{Authorization = "Bearer $AccessToken"} }

note: Not sure if this completely makes a difference, but I am using PowerShell v5.1, because Core recently re-wrote Invoke-WebRequest to use HttpClient instead of WebRequest.


The results were that my tagged photos were downloaded to the requested location at the largest resolution on facebook. I did not do the same for my albums since you can download these entirely from Facebook, but it could be done with the same general method. It would require a few more loops as you need to get the Album ID first and then use this to discover all of the Photos inside of the album before finding each corresponding image source.
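A hedged sketch of those extra loops for albums (the destination path is a placeholder, the album/photo edges are assumptions based on the Graph API references above, and error handling is omitted):

```powershell
# Enumerate my albums, then each album's photos, then download the largest image
$headers = @{Authorization = "Bearer $AccessToken"}
$albums = (Invoke-RestMethod -Method Get -Uri "https://graph.facebook.com/v2.12/me/albums" -Headers $headers).data

foreach ($a in $albums) {
    # One folder per album, named after the album
    $dest = "D:\Albums\$($a.name)"
    New-Item -Path $dest -ItemType Directory -Force | Out-Null

    $photos = (Invoke-RestMethod -Method Get -Uri "https://graph.facebook.com/v2.12/$($a.id)/photos?fields=id,images" -Headers $headers).data
    foreach ($p in $photos) {
        # images[0] is the largest size, as noted above
        Invoke-WebRequest -Uri $p.images[0].source -OutFile "$dest\$($p.id).jpg"
    }
}
```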

In the end this got most, but not all, of my tagged photos. Depending on the permissions set by my friends, I could not get certain pictures (although I could still see them on Facebook). I had no prior experience using an API and had never used OAuth access tokens in this way, so overall it was a great learning experience.

Checking Your Backup Strategy with DBAChecks

Within the last 18 months I have become an accidental DBA and found myself grasping for information and tools to help manage the handful of database servers for which I am responsible. In that journey I discovered dbatools (dbatools.io), a great community-built PowerShell module for configuring SQL instances. If you're not familiar with it, it has cmdlets that include the likes of Copy-DbaLogin, Find-DbaBackup, Get-DbaMemoryUsage, and Test-DbaPowerPlan. The last of these really encapsulates the heart of the dbatools team, as they seek to provide tools that reflect best practices for data professionals. (If you didn't know that an improperly set power plan can cause performance issues, you should probably download dbatools and give it a shot.)

Continuing this desire to help the community practice good things, the dbatools team has now released dbachecks (dbachecks.io). This is a module which leverages Pester (a PowerShell module for unit testing) to run tests against your SQL instances. You can find full details on what is required and how to install this module on their GitHub page.

There are a large number of tests included with the dbachecks module, but you can create your own if you are familiar with writing tests in pester. A few of the built in tests that I found most helpful (and actually made me re-evaluate my production systems) were the tests for Backups.

After loading the module you can check out all of the predefined tests by running Get-DbcCheck. To get a list of all of the tests that check backups, use the -Pattern parameter to search for tests with matching keywords.

Get-DbcCheck -Pattern backup

To run the tests, execute Invoke-DbcCheck with the following syntax to return both passes and failures.

Invoke-DbcCheck -Check BackupPathAccess,DefaultBackupCompression -SqlInstance localhost -ComputerName localhost

As you can see, the backup location is accessible, and I have not configured compression for my backup sets. It also took less than half a second to run these checks. Now, one of the most commonly unchecked aspects of database backups is whether they will actually restore, and conveniently enough dbachecks has a test for that as well. Under the hood, this test uses Test-DbaLastBackup to try restoring the database with the -VerifyOnly flag:

Invoke-DbcCheck -Check TestLastBackupVerifyOnly -SqlInstance localhost -ComputerName localhost

The user databases for my SSRS instances were successfully tested, but there were no backups for my system databases. I decided to double-check, and everything was in order, but you can never be too sure.

So my suggestion is to set this up as a scheduled task or agent job and make sure that your backups are up to date. This module makes it easy to ensure that you don’t lose your job because of data loss. This is just a highlight of a few of these tests; imagine the rest of the checks and tests that are possible with this module, whether built in or of your own creation. Download the module from the PowerShell Gallery and give it a try.
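As a rough sketch of the scheduled-task suggestion (the task name, run time, and chosen check are my own assumptions, not anything prescribed by dbachecks), a nightly task could look something like this:

# Hypothetical sketch: register a nightly scheduled task that runs the
# backup restore check. Task name, time, and check are assumptions.
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument (
    '-NoProfile -Command "Invoke-DbcCheck -Check TestLastBackupVerifyOnly ' +
    '-SqlInstance localhost -ComputerName localhost"'
)
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName 'Nightly-DbaChecks' -Action $action -Trigger $trigger

In practice you would likely redirect the results somewhere durable (a file share, a database, or an email) rather than letting them scroll by unattended.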



Deploying SQL 2017 in 15 minutes

This last Wednesday (10/11/17) I had the opportunity to share about PowerShell DSC at JAXSUG, and I wanted to follow up and share a little more about what I was doing in my demo.

PowerShell Desired State Configuration (referred to as DSC from here on) is Microsoft’s attempt to make server configuration easier and more stable for the admin. DSC is available in any version of Windows Server (or even client) running PowerShell version 4 or greater. One of the draws of DSC is that configurations are human-readable and declarative, which makes them easy to understand for anyone reading the script. Additionally, these configurations provide a way to configure nodes en masse, as well as a way to keep nodes from drifting out of their desired configuration.

Each node has a Local Configuration Manager (LCM), which is responsible for processing the configurations it receives. You can inspect your LCM’s settings by running Get-DscLocalConfigurationManager in an administrative PowerShell console. PowerShell is used to compile the configurations into .mof files (or a .meta.mof if it is for the LCM itself). These files comply with the CIM (Common Information Model), an open standard used by many different platforms. (As a cool side note, some people use MOF files authored in PowerShell to manage Linux systems and networking devices.)
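For example, you can narrow the output down to just the LCM properties discussed below (these are standard LCM setting names):

# Inspect the LCM's current modes and overwrite behavior
Get-DscLocalConfigurationManager |
    Select-Object RefreshMode, ConfigurationMode, RebootNodeIfNeeded, AllowModuleOverwrite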

Each configuration that is written may also need additional resources to extend the kinds of settings it can manage. By default PowerShell knows about 13 different resources for manipulating files and server settings. In order to install SQL Server, I needed to download and use the xSQLServer resource, which may be found on GitHub.
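Assuming you have PowerShellGet available, the resource can also be pulled straight from the PowerShell Gallery rather than cloned from GitHub:

# Install the xSQLServer DSC resource from the PowerShell Gallery
Install-Module -Name xSQLServer -Force
# Confirm the DSC resources it provides are now discoverable
Get-DscResource -Module xSQLServer | Select-Object Name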

Below is a sample of the code for my LCM from my demo, with some comments.

#These settings produce a $ComputerName.meta.mof
# and are used to set up the Local Configuration Manager
[DSCLocalConfigurationManager()]
Configuration LCM_Push
{
    #Parameter block allows this configuration to be reused
    param (
        [string]$ComputerName
    )
    #The Node keyword determines
    #what the meta.mof file will be named
    Node $ComputerName
    {
        Settings
        {
            #These settings are only a few of all that are possible
            AllowModuleOverwrite = $True
            ConfigurationMode    = 'ApplyOnly'
            RefreshMode          = 'Push'
            RebootNodeIfNeeded   = $True
            #The CertificateID allows you to use a document-signing
            #certificate to encrypt credentials within the .mof file
            CertificateID        = "56201A6926C2134876123409874C437E7E66BD"
        }
    }
}

If you would like to check out the script I used you can download it here: Push-SQL2017.
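As a hedged sketch of the push workflow (the output path and node name here are placeholders, not values from the demo script), compiling and applying the meta-configuration looks roughly like this:

# Compile the .meta.mof into C:\DSC, then apply it to the target's LCM
LCM_Push -ComputerName 'SQL01' -OutputPath 'C:\DSC'
Set-DscLocalConfigurationManager -Path 'C:\DSC' -Verbose
# Later, push the actual SQL Server configuration .mof the same way
Start-DscConfiguration -Path 'C:\DSC' -Wait -Verbose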

While my demo only took about 15 minutes to perform, the initial setup took many hours of testing and troubleshooting: the environment (domain controller, authoring host, and deployment VM), the certificate server and certificates for encrypting the MOF (optional in test, but necessary in prod), and the configuration itself. It has turned out to be a worthwhile investment, as many requests come in to me for new or test instances, and I now know I can reliably roll out a new version with a few clicks.


Note: This is the last of my series on dbatools.io, and upon publishing it I discovered that dbatools is changing its prefix to Verb-DbaNoun. Read about it here: https://dbatools.io/bagofbobbish-beta/

In this series on the dbatools module (https://dbatools.io) we are going to look next at Copy-SqlLinkedServer.

Linked servers may not be widely used by some organizations, but they were heavily used on one of my servers. They are usually a workaround for reaching some other Oracle database, or a means of connecting to a reporting data source. These connections can be complicated, as they sometimes have local credentials stored in the connection with no means of retrieving the password. By using this command in the following way, you can copy a linked server to another SQL instance with ease.

Copy-SQLLinkedServer -Source SQL2012 -Destination SQL2016 -LinkedServer OracleDB2

Any settings that you need for the linked server (RPC Out, Data Access, etc.) should be turned on before moving the linked server with this method, as it only copies the name and login (if present) and inherits all other settings from the destination instance.
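As an illustrative sketch (the instance and linked server names follow the example above; verify the option names against your environment), those remaining options can be set on the destination with sp_serveroption:

# Enable RPC Out and Data Access on the copied linked server at the destination
Invoke-Sqlcmd -ServerInstance 'SQL2016' -Query @"
EXEC sp_serveroption @server = 'OracleDB2', @optname = 'rpc out',     @optvalue = 'true';
EXEC sp_serveroption @server = 'OracleDB2', @optname = 'data access', @optvalue = 'true';
"@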



In this series on the dbatools module (https://dbatools.io) we are going to look next at Copy-SqlDatabaseMail.

If you are not in a particularly large shop (or a well-funded one) you may not have the most robust monitoring solution, but you may at least have an open email relay available for your use. Setting up email alerts on a SQL instance is very important so that you can receive notifications of job failures, or completions of Integration Services jobs. Additionally, you may have triggered jobs which notify you when a specific database is filling up. Once you configure these settings, it can be a pain to recreate them on another instance, especially if you are not the email admin.

By using this syntax you can easily migrate the settings:

Copy-SQLDatabaseMail -Source SQL2012 -Destination SQL2016

After migrating the settings you may still need to enable the mail profile on the instance you are migrating to (right-click SQL Server Agent > Properties > Alert System > Enable mail profile), but once you do, you will have all the settings in place for your alerts. You may also want to use Copy-SqlOperator if you have specific email addresses set up on your jobs. This is absolutely a prerequisite before you migrate those jobs from another instance.
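For that follow-up step, the operator migration follows the same pattern as the mail settings (using the same example instance names as above):

Copy-SqlOperator -Source SQL2012 -Destination SQL2016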



In the last post on the dbatools module (https://dbatools.io) we talked about Copy-SqlCredential. Using that cmdlet may be a prerequisite for using Copy-SqlJob.

Copy-SqlJob will move SQL Agent jobs, including their schedules, from one instance to another. This can be highly efficient when moving an instance from one server to another. The only downside of this cmdlet is that it copies the job with the exact same object ID, which means that using it to duplicate jobs as a template will cause them to be linked on a single instance: if you change a setting on one, it will affect the other jobs copied from the same source.

The syntax is as follows:

Copy-SqlJob -Source SQL2012 -Destination SQL2016 -Jobs NightlyMaintenance,HourlyTransactionBackups