Dbatools to V1.0 and Beyond

Good News, Everyone!

dbatools has finally reached critical mass and is being released as version 1.0. This is a major step for the maintainers and developers of this module, and it has been the goal for over 18 months. The quick and dirty on it is this:

  • Cross Platform Support
  • MFA for Azure Support in cmdlets
  • New cmdlets for managing DBRoles
  • Change Log of every subversion shipped to the PowerShell Gallery
  • Digitally signed so cmdlets work in restricted environments
  • Optimized for importing
  • Over 550 cmdlets available for automating Microsoft SQL Instances
    • Nearly 5x the cmdlets available in the MS SQL Server Module 
  • The quality of the cmdlets has also gone up with 1000s of Integration tests
  • Parameters, internal code structure, and error handling have been standardized

These are just the high points that stick out in my mind at this time. There are quite a few blogs being released today highlighting some of the other great features coming to V1.0 (Read More at https://dbatools.io/dbatools10)

If this is the first you are hearing about dbatools, download it now (Install-Module dbatools) and give it a spin!

Personal Journey to V1.0

For me personally, the journey for Dbatools reaching 1.0 has been a big part of my own professional growth.

Back in 2016 I started on a team that was understaffed and drowning in technical debt. Suddenly, I was handed the DBA responsibilities after a senior systems engineer retired. I did not know much about databases, and was more of a PowerShell enthusiast. I ended up discovering dbatools while trying to troubleshoot a network latency issue with our Data Architect. I then started to use it further when migrating from a 2012 AG to a 2016 AG cluster. (Using this module actually helped inspire me to start this blog.)

In 2017 I attended PASS Summit trying to up my DBA skills, but found myself drawn to all of the PowerShell sessions. I just happened to see Aaron Nelson give a talk on Reporting Services with PowerShell and was trying to find him at lunch to ask a question. I sat down at his table, but instead of talking with him I ended up talking with Stuart Moore, Sander Stad, and ConstantineK about how I was using dbatools, and just happened to find out that these guys were some of the main contributors. (I also accidentally stalked them the rest of the week.) Later that week I also met Chrissy LeMaire and Shawn Melton, who both encouraged me to contribute (Thanks for doing this and being Awesome!).

I did a few PRs in 2017, but in 2018 I finally started contributing more consistently to the project. While I have never felt like a true data professional, I felt pretty confident in PowerShell, so most of my pull requests have been focused on integration tests and PSScriptAnalyzer rules. While writing tests is not glamorous, I honestly used it as an opportunity to learn various aspects of SQL Server, since most integration tests require you to get/set/update objects on an instance. So if I am writing a test to copy a Database Mail object, I probably need to understand how to configure it using T-SQL first.

And Beyond!

Now that dbatools is at version 1.0, the expectation is more stable releases and advance notice of any major breaking changes. There will also be more development for cloud-native databases, continued support for cmdlets running on Linux, and whatever cmdlets the community needs.

I am personally going to continue working on integration tests to raise the overall code coverage of the cmdlets. I would also like to work on adding requested cmdlets that have been sitting in the queue for a few months, and maybe even some Linux specific cmdlets.


Not enough can be said for Chrissy and the hard work she and the team have been doing in launching dbatools into this next phase. The beauty of the SQL Collaborative is that they have been able to quickly react to the needs of the MS SQL community, and they promise to keep doing so in the years to come.

Creating Word Documents with PowerShell


For the last few months I have been working on a course called “Introduction to Windows PowerShell” for Pragmatic Works. This has been a challenging and rewarding task as I have found that you learn something best when you teach it to someone else.

Part of the deliverables for this course was providing labs for students to walk through on their own. Also for each lab, there is a solution that I provide in case they get stuck. On the first pass I had placed all of these labs and solutions in a separate .ps1 file, but after further review it was going to be better if they were all in a single Word Document. This however meant combining roughly 20 files into 2 Documents.

Being somewhat crunched for time, I decided to see if I could automate feeding these .ps1 files into a Word Document using PowerShell.


My gut response for the last few years has been to automate as many things as possible using PowerShell. If this could not be done entirely with PowerShell, at least I would try to do as much as possible.

In order to do this I had to learn a little bit:

  • Are there any CLI options for Word?
  • If not, are there any .NET Classes exposed for Word?
  • And if either of these first two were true, could I produce a new script?

.NET Classes for Microsoft Word

Upon doing some digging I found a couple of forum posts on manipulating files using PowerShell (StackOverflow, MSDN). What caught my eye from these posts is that they both used ‘Word.Application’ as a COM object. Looking into this led me to information about Word’s VBA interface for scripting (MSDN – VBA). The VBA scripting wasn’t exactly what I was looking for, but I figured I would keep digging. Finally I found the gold at the end of the rainbow: the entire .NET namespace documentation for Office 2013+ (MSDN – Namespace). There is a lot that this documentation covers, but I will just walk through the basics of creating a file, formatting the text, adding text from other files, and finally saving the document.

Creating a File

In order to create a file you must have an instance of Word in your PowerShell session. To do this you can use the New-Object cmdlet with the -ComObject parameter, as shown below:

PS C:\WINDOWS\system32> $WordApplication = New-Object -ComObject "word.application"

This starts an instance of Word, which you can see in your running processes, but it does not open a visible window. If you want to see the window you can do the following:

PS C:\WINDOWS\system32> $WordApplication.Visible = $true

If you look at Word, it is now visible, but no document is open. We must now create the document we want to use. We could also open up a document if we had an existing one.
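As a side note, opening an existing file instead is a short sketch like the following (the path here is a made-up example; substitute your own file):

```powershell
# Open an existing document instead of creating a new one.
# 'C:\Temp\Existing.docx' is a hypothetical path for illustration.
$ExistingPath = 'C:\Temp\Existing.docx'
$Document = $WordApplication.Documents.Open($ExistingPath)
```

Either way, you end up with a document object to work against.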

Microsoft Word

Off of the $WordApplication variable we can create a new document. It is best to save this new Document into another variable for future use.

 PS C:\WINDOWS\system32> $Document = $WordApplication.Documents.Add()

At this point you now have a document to work with.

Writing the Document

By this point I had spent a lot of time trying and failing. Since there was a variable for both the Application ( $WordApplication ) and the Document ( $Document ), it was difficult to determine which one I needed to write to. I first assumed it was the Document variable because that is what I was editing, but all of the writing actually happens through the ‘Selection’ interface on the Application variable.

PS C:\WINDOWS\system32> $Writer = $WordApplication.Selection

There are several things I wanted to do inside of this document. First I wanted to bring in content from various files, but I also wanted to create a heading for each file as it was written. The first part seemed easy, because there is a method called .TypeText(). I found that you can insert text by passing a string, or by calling another cmdlet like Get-Content, inside of this method. I also found that you could set the style of the document before you ‘typed’ and then change it back to the normal format. The most confusing part was the inconsistency between the interfaces: when calling $Writer.TypeText((Get-Content -Path $blah)) you wrap the value in parentheses, but when changing the style you assign a value, like $Writer.Style = 'Heading 1'.

In the end, I wrote a foreach loop in the following way to import all of the text while setting the style on import.

 PS C:\WINDOWS\system32> 1..10 | foreach -Process {
>> $labpath = "C:\PragmaticWorks\Introduction to Windows PowerShell\Module$_\Lab-Solution.ps1"
>> $Writer.Style = 'Heading 1'
>> $Writer.TypeText("Module $_ Lab Solution")
>> $Writer.TypeParagraph()
>> $Writer.Style = 'Normal'
>> $Writer.TypeText((Get-Content $labpath))
>> $Writer.TypeParagraph()
>> }

note: .TypeParagraph() inserts a new line so that I would have a break between text.

When I finished inserting all of my files I saved the file and closed the document.

PS C:\WINDOWS\system32> $Document.SaveAs($filepath)
PS C:\WINDOWS\system32> $Document.Close()
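One thing worth noting: closing the document does not exit the hidden WINWORD.EXE process. A sketch of the cleanup I would suggest, assuming you are done with the application entirely:

```powershell
# Quit the Word application so the background process does not linger
$WordApplication.Quit()

# Release the COM reference so .NET lets go of the process handle
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($WordApplication) | Out-Null
Remove-Variable WordApplication, Document
[System.GC]::Collect()
```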


This was a fun conceptual project to see if it was possible, and it did end up saving me the pain of copying and pasting all 20 files into 2 separate documents. I did have some issues with formatting, which I would have loved to explore further, but I am not sure I could have automated this formatting since I had code examples in my source documents. I ended up manually aligning these code examples once in Word. I did notice a .Find() interface on the Application variable, and it would be neat to explore that further for gathering information from Word documents on distributed systems.

An MVP I follow suggested that I make this into a module. What are your thoughts? Would you like to see a PowerShell module for interfacing with documents? Feel free to tweet me @joshCorr for more info or if you benefited from this blog.

Backing Up Facebook Photos using PowerShell


I have been using social media since 2006 when I first signed up for Myspace. That was the day and age when my parents had dial up and going on to check social media (or chat on AIM) was a 15 minute treat once a week (somewhat joking). Myspace was cool. You could edit your page. It was the ultimate self expression for an IT nerd wannabe (I had some pretty cool HTML and CSS on that page). After a few years Myspace was eh, but Facebook was the new hotness. No ads, clean, uniform, mature, verified users. I decided to get one in spring of 2008 and have used it up until recently. Within the last 2 years I started withdrawing from using it for various reasons (too many to get into here). The only thing that kept me from deleting it long ago was the budding relationship with my wife, which was thoroughly documented and photographed on Facebook.

So I decided to see if there was a way to backup the photos and albums in my Facebook account programmatically.


I could not find any apps that looked legit for downloading Facebook albums. I ran across a Stack Overflow post that described downloading photos from a Facebook page using their Graph API (Get All Photos of a Page Using Facebook Api), which got me thinking. I am not a programmer, so writing a JavaScript or C# app didn’t seem viable for accomplishing this task. However, I have a fair amount of experience with PowerShell, which has Invoke-WebRequest and Invoke-RestMethod, and I figured I could use these cmdlets to gather the info and download the pictures.

In order to do this I had to learn a little bit, as the next few sections show.

Getting an Access Token

In order to connect to graph.facebook.com I first needed an Access Token to authorize my connection. You can generate this using code, but the easiest way I found to do this as an OAuth noob was the following.

  1. Login to Facebook and navigate to https://developers.facebook.com
  2. Click on “Tools & Support” in the navigation menu
  3. Click on “Graph API Explorer” under popular tools
  4. Click “Get Token”
  5. Click “Get User Access Token” on drop down
  6. Select the “user_photos” permission and click “Get Access Token” (I also selected “user_posts” and “publish_actions” because I wanted to check posts as well as potentially update the photos)
  7. A facebook app authorization page will pop up. Accept this.
  8. Now you should have a valid Access Token above the API navigation bar
  9. Copy and Paste this Access Token into a PowerShell variable named $AccessToken

note: you can check the info on the token by clicking the blue i next to the Access Token to see when it expires. Mine usually lasted an hour and a half to two hours.
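Once the token is in a variable, a quick sanity check (a sketch — the token value shown is obviously a placeholder) is to ask the Graph API who you are:

```powershell
# Paste the token copied from the Graph API Explorer (placeholder value shown)
$AccessToken = 'EAACEdEose0cBA...'

# A /me call should return your id and name if the token is valid
Invoke-RestMethod -Method Get -Uri 'https://graph.facebook.com/v2.12/me' -Headers @{Authorization = "Bearer $AccessToken"}
```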

Finding my Photos

Now that I have an AccessToken I can use PowerShell to grab a list of my photos or my albums using some of the following code.

To gather all* photos I am tagged in:

Invoke-RestMethod -Method Get -Uri https://graph.facebook.com/v2.12/me/photos -Headers @{Authorization = "Bearer $AccessToken"}

To gather albums I have created and uploaded:

Invoke-RestMethod -Method Get -Uri https://graph.facebook.com/v2.12/me/albums -Headers @{Authorization = "Bearer $AccessToken"}

note: Be sure to include 'Bearer' in your Authorization header, as your token will not work without it.

This execution returns a PSCustomObject property called Data, which is a group of hashtables inside of a hashtable. The parent hashtable can be expanded to show the ID, name, and other fields of the child hashtables. If you want to include fields not in the default view, you can add them to the end of your Uri with the following syntax.

Invoke-RestMethod -Method Get -Uri 'https://graph.facebook.com/v2.12/me/photos?fields=ID,Name,Images,Link' -Headers @{Authorization = "Bearer $AccessToken"}

And if you want to return more than the default 25, add something like &limit=100 (note that the URL must be quoted, since & is a reserved operator in PowerShell):

Invoke-RestMethod -Method Get -Uri 'https://graph.facebook.com/v2.12/me/photos?fields=ID,Name,Images,Link&limit=100' -Headers @{Authorization = "Bearer $AccessToken"}

For full reference on the methods, fields, and edges on this api please reference Graph Api Photo Reference and Graph Api Album Reference
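The Graph API also pages its results: each response includes a paging.next URL whenever more results exist. A hedged sketch of walking every page, assuming the response shape described above:

```powershell
# Collect every page of tagged photos by following paging.next
$AllPhotos = @()
$Uri = 'https://graph.facebook.com/v2.12/me/photos?limit=100'
while ($Uri) {
    $Response = Invoke-RestMethod -Method Get -Uri $Uri -Headers @{Authorization = "Bearer $AccessToken"}
    $AllPhotos += $Response.data
    # paging.next is absent on the last page, which ends the loop
    $Uri = $Response.paging.next
}
```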

Downloading my Photos

At this point I can stick the output of Invoke-RestMethod into a variable called $Pictures, from which I will extract the ‘Data’ field into another variable called $Images. (This is a somewhat redundant step, but I did it for ease of use later on in a loop.)

$Images = $($Pictures.Data)

Initially I thought the Link field was where the photos were stored, but in fact it was only the link to the post of the picture. Also important to know: the ‘images’ field does not include one reference to the image, but usually 7-9 different sizes of the same image (thankfully sorted from largest to smallest).

From here we can use Invoke-WebRequest in a loop to download the images. Each image has its own source location, which is the URL to the original file. The photos can be saved by redirecting this source URL to an outfile named after the image’s ID.

  foreach ($p in $Images){Invoke-WebRequest -Uri $($p.images[0].source) -OutFile "D:\TaggedPhotos\$($p.id).jpg" -Headers @{Authorization = "Bearer $AccessToken"} }

note: Not sure if this completely makes a difference, but I am using Windows PowerShell 5.1, because PowerShell Core recently rewrote Invoke-WebRequest to use HttpClient instead of WebRequest.


The results were that my tagged photos were downloaded to the requested location at the largest resolution available on Facebook. I did not do the same for my albums, since you can download those entirely from Facebook, but it could be done with the same general method. It would require a few more loops, as you need to get the album ID first and then use it to discover all of the photos inside the album before finding each corresponding image source.
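A rough sketch of what that album loop might look like (untested — the field names follow the photo examples above, and the output folder is a made-up path):

```powershell
# For each album, list its photos, then download each photo's largest image
$Albums = (Invoke-RestMethod -Method Get -Uri 'https://graph.facebook.com/v2.12/me/albums' -Headers @{Authorization = "Bearer $AccessToken"}).data
foreach ($a in $Albums) {
    # Ask for the images field so each photo carries its size variants
    $AlbumPhotos = (Invoke-RestMethod -Method Get -Uri "https://graph.facebook.com/v2.12/$($a.id)/photos?fields=images" -Headers @{Authorization = "Bearer $AccessToken"}).data
    foreach ($p in $AlbumPhotos) {
        # images[0] is the largest size, per the sorting noted above
        Invoke-WebRequest -Uri $p.images[0].source -OutFile "D:\Albums\$($a.name)_$($p.id).jpg"
    }
}
```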

In the end this got most, but not all of my tagged photos. Depending on the permissions set by my friends I could not get specific pictures (however I could still see them on facebook). This was more of a learning experience for me as I had no prior experience with using an API and had never used oAuth Access Tokens in this way. Overall it was a great learning experience.

Checking Your Backup Strategy with DBAChecks

Within the last 18 months I have become an accidental DBA and found myself grasping for information and tools in order to help manage the handful of database servers for which I am responsible. In that journey I discovered dbatools (dbatools.io), which is a great community-built PowerShell module for configuring SQL instances. If you’re not familiar with it, it has cmdlets that include the likes of Copy-DbaLogin, Find-DbaBackup, Get-DbaMemoryUsage, and Test-DbaPowerPlan. The last of these cmdlets really encapsulates the heart of the dbatools team, as they are seeking to provide tools that reflect the best practices for data professionals. (If you didn’t know that an improperly set Power Plan can cause performance issues, you should probably download dbatools and give it a shot.)

Continuing this desire to help the community practice good things, the dbatools team has now released dbachecks (dbachecks.io). This is a module which leverages Pester (a PowerShell module for unit testing) to run tests against your SQL instances. You can find full details on what is required and how to install this module on their GitHub page.

There are a large number of tests included with the dbachecks module, but you can create your own if you are familiar with writing tests in pester. A few of the built in tests that I found most helpful (and actually made me re-evaluate my production systems) were the tests for Backups.

After loading the module you can check out all of the predefined tests by running Get-DbcCheck. In order to get a list of all of the tests that check backups, use the -Pattern parameter to search for tests with matching keywords.

Get-DbcCheck -Pattern backup

In order to run the tests you execute Invoke-DbcCheck with the following syntax, which returns both passing and failing tests.

Invoke-DbcCheck -Check BackupPathAccess,DefaultBackupCompression -SqlInstance localhost -ComputerName localhost

As you can see, the backup location is accessible, and I have not configured compression for my backup sets. It also took less than half a second to run these checks. Now, one of the most commonly unchecked aspects of database backups is whether they will actually restore, and conveniently enough dbachecks has a test for that as well. Under the hood, this test uses Test-DbaLastBackup to attempt restoring the database with the -VerifyOnly flag:

Invoke-DbcCheck -Check TestLastBackupVerifyOnly -SqlInstance localhost -ComputerName localhost

The user databases for my SSRS instances were successfully tested, but there were no backups for my system databases. As a result I decided to double-check, and everything was in order, but you can never be too sure.

So my suggestion is to set this up as a scheduled task or agent job and make sure that your backups are up to date. This module makes it easy to make sure that you don’t lose your job because of data loss. This is just a highlight of a few of these tests. Imagine the rest of the checks and tests that are possible with this module, whether built in or of your own creation. Download the module from the PowerShell Gallery and give it a try.
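As a sketch of the scheduled-task approach (the script path, output file, and schedule are assumptions — adjust them to your environment):

```powershell
# Register a daily task that runs the backup checks and writes results to a file
$Action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -Command "Invoke-DbcCheck -Check TestLastBackupVerifyOnly -SqlInstance localhost | Out-File C:\DbcChecks\LastRun.txt"'
$Trigger = New-ScheduledTaskTrigger -Daily -At 6am
Register-ScheduledTask -TaskName 'DailyBackupChecks' -Action $Action -Trigger $Trigger
```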



Deploying SQL 2017 in 15 minutes

This last Wednesday (10/11/17) I had the opportunity to share about PowerShell DSC at JAXSUG and I wanted to follow up and share a little more about what I was doing in my demo.

PowerShell Desired State Configuration (referred to as DSC from here on) is Microsoft’s attempt to make server configuration easier and more stable for the admin. DSC is a feature available in any version of Windows Server (or even client) running PowerShell version 4 or greater. One of the draws of DSC is that configurations are human-readable, declarative documents, which makes them easy to understand for anyone reading the script. Additionally, these configurations provide a way to configure nodes en masse, as well as a way to keep nodes from drifting out of their configuration.

Each node has a Local Configuration Manager (LCM), which is responsible for processing the configurations it receives. You can find out how your LCM is configured by running Get-DscLocalConfigurationManager in an administrative PowerShell console. PowerShell is used to compile the configurations into a .mof file (or a .meta.mof if it is for the Local Configuration Manager). These files comply with the CIM (Common Information Model), which is an open standard used by many different platforms. (Just a cool side note: there are some people who use MOF files, made with PowerShell, to control Linux systems and networking devices.)

Each configuration that is written may also need additional resources in order to extend the type of commands that can be given. By default PowerShell knows about 13 different resources for manipulating files and server settings. In order to install SQL Server I needed to download and use the xSQLServer resource, which can be found on GitHub.
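Pulling the resource down is a one-liner on PowerShell 5+, assuming the authoring host can reach the PowerShell Gallery:

```powershell
# Download the xSQLServer DSC resource from the PowerShell Gallery
Install-Module -Name xSQLServer -Force

# Confirm the DSC resources it exposes
Get-DscResource -Module xSQLServer
```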

Below is a sample of the code for my LCM from my demo, with some comments:

#These settings produce a $ComputerName.meta.mof
# and are used to set up the Local Configuration Manager
[DSCLocalConfigurationManager()]
Configuration LCM_Push
{
    #Parameter block allows this configuration to be used many times
    param(
        [string]$ComputerName
    )
    #The Node keyword is used to determine
    #what the name of the meta.mof will be called
    Node $ComputerName
    {
        Settings
        {
            #These settings are only a few of all that are possible
            AllowModuleOverwrite = $True
            ConfigurationMode = 'ApplyOnly'
            RefreshMode = 'Push'
            RebootNodeIfNeeded = $True
            #The CertificateID allows you to use a Document Signing
            #certificate to encrypt credentials within the .mof file
            CertificateID = "56201A6926C2134876123409874C437E7E66BD"
        }
    }
}

If you would like to check out the script I used you can download it here: Push-SQL2017.
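For context, the general push workflow (a sketch — the paths and node name are hypothetical, and the SQL configuration name stands in for whatever your script defines) is to compile the configuration into .mof files and then push them:

```powershell
# Compile the LCM meta.mof and apply it to the target node
LCM_Push -ComputerName 'SQLNODE01' -OutputPath 'C:\DSC\LCM'
Set-DscLocalConfigurationManager -Path 'C:\DSC\LCM' -Verbose

# Compile and push the actual SQL configuration, then watch it converge
SQL2017Install -OutputPath 'C:\DSC\SQL'   # hypothetical configuration name
Start-DscConfiguration -Path 'C:\DSC\SQL' -Wait -Verbose
```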

While my demo itself only took about 15 minutes to perform, the initial setup of the environment (Domain Controller, authoring host, and deployment VM), the certificate server and certificates for encrypting the MOF (optional in test, but necessary in prod), and the configuration itself took many hours of testing and troubleshooting. It has turned out to be a necessary tool, as many requests come in to me for new or test instances, and I now know I can reliably roll out a new version with a few clicks.


Note: This is the last of my series on dbatools.io, and upon publishing it I discovered that dbatools is changing their cmdlet prefix to Verb-DbaNoun. Read about it here: https://dbatools.io/bagofbobbish-beta/

In the series on the dbatools module (https://dbatools.io) we are going to look next at Copy-SqlLinkedServer.

Something that may not be widely used by some organizations, but was heavily used on one of my servers, is linked servers. These are usually a workaround for reaching some other Oracle DB, or even a means of connecting to a reporting data source. These connections can be complicated, as they sometimes have local credentials stored in the connection with no means of retrieving the password. By using this command in the following way, you can copy the linked server to another SQL instance with ease.

Copy-SQLLinkedServer -Source SQL2012 -Destination SQL2016 -LinkedServer OracleDB2

Any settings that you need for the linked server (RPC OUT, Data Access, etc) should be turned on before moving the linked server with this method as it only copies the name and login (if present) and inherits all other settings from the destination instance.
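If you want to verify those settings on the source first, a hedged sketch using Invoke-Sqlcmd against the sys.servers catalog view (column names per the SQL Server catalog documentation) might look like:

```powershell
# Check RPC Out and Data Access settings on the source's linked servers
Invoke-Sqlcmd -ServerInstance 'SQL2012' -Query @"
SELECT name, is_rpc_out_enabled, is_data_access_enabled
FROM sys.servers
WHERE is_linked = 1;
"@
```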



In the series on the dbatools module (https://dbatools.io) we are going to look next at Copy-SqlDatabaseMail.

If you are not in a particularly large shop (or a well-funded one) you may not have the most robust monitoring solution, but you may at least have an open email relay for your use. Setting up email alerts on a SQL instance is very important so that you receive notifications of job failures or completions of Integration Services jobs. Additionally, you may have triggered jobs which notify you when a specific database is filling up. Once you configure these settings, it can be a pain to recreate them on another instance, especially if you are not the email admin.

By using this syntax you can easily migrate the settings

Copy-SQLDatabaseMail -Source SQL2012 -Destination SQL2016

After migrating the settings you may still need to enable the mail services on the instance you are migrating to (right-click the Agent > Alert System > Enable mail profile), but once you do this you will have all the settings in place for your alerts. You may also want to use Copy-SqlOperator if you have specific email addresses set up on your jobs. This is absolutely a prerequisite before you migrate specific jobs from another instance.
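If you do need the operators as well, a sketch of that copy (using Copy-SqlOperator as named above) follows the same pattern as the mail settings:

```powershell
# Copy the SQL Agent operators (the notification email targets) too
Copy-SqlOperator -Source SQL2012 -Destination SQL2016
```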



In the last post on the dbatools module (https://dbatools.io) we talked about Copy-SQLCredential. Using that cmdlet may be a prerequisite to using Copy-SQLJob.

Copy-SQLJob will move SQL Agent jobs, including their schedules, from one instance to another. This can be highly efficient when moving an instance from one server to another. The only downside of this cmdlet is that it copies the job with the exact same object ID, which means that using it to duplicate jobs as a template will cause them to be linked on a single instance: if you change a setting on one, it will affect the other jobs copied from the same instance.

The syntax is as follows

Copy-SqlJob -Source SQL2012 -Destination SQL2016 -jobs NightlyMaintenance,HourlyTransactionBackups


In the series on the dbatools module (https://dbatools.io) we are going to look next at Copy-SqlCredential.

As you may or may not know, if DBAs are going to keep their databases in shape and continue to process ETL (Extract, Transform, and Load) jobs, they need to leverage automation. One method provided by SQL Server is the Agent, which may have jobs that leverage all sorts of items (T-SQL, SSIS packages, even batch and PowerShell scripts). Most of these jobs can run as the service account running the Agent, but in some cases (batch and SSIS among others) a Credential and Proxy are needed to run them.

In migrating from one server to another, the DBA may not have access to the passwords for the Credentials that exist to run certain packages. I found Copy-SQLCredential to be helpful for this. In addition, it can be leveraged in the following way alongside Copy-SQLProxyAccount.

Copy-SQLCredential -Source SQL2012 -Destination SQL2016 -Credential 'Domain\CredentialedAccount'
Copy-SQLProxyAccount -Source SQL2012 -Destination SQL2016 -Credential 'Proxy_CredentialedAccount'



As I mentioned in my last post, I am going to be spending some time going over a couple of my favorite tools from the dbatools (https://dbatools.io) PowerShell module.

The one cmdlet that actually led me to the module in the first place, and which has been a tried and true tool for me, is Copy-SQLLogin. I am relatively new to the work of a DBA, but one of the most difficult lessons I learned early on is that if you have a SQL Always On Availability Group (AG) setup, then you have to maintain a copy of the logins on all replicas where you may need to access the data in a read capacity. From what I have read, to keep these in sync you would need to either script out each login and run the CREATE script on the other replica, or use partial containment to get around this problem.

This cmdlet, however, offers a method similar to the first one, but with the option of syncing all of the logins or only select logins.

My preferred method of using this command would be in the following syntax

  Copy-SQLLogin -Source SQL2012 -Destination SQL2016 -Logins 'Domain\ServiceAccount','LocalSQLLogin'

This provides you the flexibility to copy the logins for the databases that you may be moving. After doing this from the original source, I would then switch the source and destination for the primary and any secondary replica in the cluster. One caveat I found is that if you have another AG on the secondary that is not a readable secondary, or is read-intent only, this cmdlet may fail or warn that not all the databases were available.
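To keep every replica in sync, the same call can be wrapped in a small loop (a sketch — the replica names here are made up):

```powershell
# Push the same logins from the primary to each secondary replica
$Primary     = 'SQL2012AG1'
$Secondaries = 'SQL2012AG2','SQL2012AG3'
foreach ($Replica in $Secondaries) {
    Copy-SQLLogin -Source $Primary -Destination $Replica -Logins 'Domain\ServiceAccount','LocalSQLLogin'
}
```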

Another robust use of this cmdlet is with the -SyncOnly parameter:

  Copy-SQLLogin -Source SQL2012 -Destination SQL2016 -synconly

This does not add or drop the user, but merely syncs up the object securables. For instances that are 2012 and newer, this cmdlet can move the SIDs, passwords, and other items, making it a powerful addition to any DBA’s tool belt.