Send email on Azure Automation runbook failure

One of the downsides of using Azure Automation runbooks is the limited visibility of runbook outcomes. That’s one of the few downsides I could actually find when I wrote my “Running database maintenance on Azure SQL DB with Azure Automation” post. But because we execute the runbooks against our production environment, visibility is very important to us.

To get more insight into this, I’ve implemented a try-catch block in every runbook we have, which sends an email to our ITOps team when a runbook fails. So how did we do this?

 
Create an email runbook
To keep it a bit modular (and I know it could be set up even better), I’ve made a runbook that is only used to send an email to our IT Operations team. This email includes the runbook name and the error that occurred.

Because we are going to include sensitive account information in an Azure Automation runbook, it’s wise to store the username and password of your Office 365 account as a “Credential” in Azure Automation. This stores your account information safely, and allows you to use it in your runbook later. This option can be found in the Automation Account’s left menu.
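As an alternative to the portal, you can also create that Credential asset from PowerShell. Here is a sketch, assuming the AzureRM.Automation module is installed and you are logged in; the resource group, account name, and password values are placeholders you’ll need to replace:

```powershell
# Build a PSCredential object from the Office 365 account details
$Password   = ConvertTo-SecureString "YourPassword" -AsPlainText -Force
$Credential = New-Object System.Management.Automation.PSCredential ("no-reply@domain.com", $Password)

# Store it as a Credential asset in the Automation Account
New-AzureRmAutomationCredential -ResourceGroupName "YourResourceGroup" `
    -AutomationAccountName "YourAutomationAccount" `
    -Name "CredentialName" `
    -Value $Credential
```

The name you pick here (“CredentialName”) is the name you pass to Get-AutomationPSCredential in the runbook below.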

Now that we’ve done that, let’s create a runbook that sends an email:

workflow EmailError
{
  Param
  (
    # Input parameters
    [Parameter (Mandatory = $true)]
    [string] $RunbookName,

    [Parameter (Mandatory = $true)]
    [string] $MessageBody
  )

    # Retrieve Office 365 credential from Azure Automation Credentials
    $O365Credential = Get-AutomationPSCredential -Name "CredentialName"
    
    InlineScript
    {
        # Create new MailMessage
        $Message = New-Object System.Net.Mail.MailMessage
        
        # Set address-properties
        $Message.From = "no-reply@domain.com"
        $Message.replyTo = "no-reply@domain.com"
        $Message.To.Add("ITOPS@domain.com, support@domain.com")
  
        # Set email subject
        $Message.SubjectEncoding = ([System.Text.Encoding]::UTF8)
        $Message.Subject = "Failed job: $Using:RunbookName"
        
        # Set email body
        $Message.Body = "Error message: <br /><br /><br /><br /><br /> $Using:MessageBody"
        $Message.BodyEncoding = ([System.Text.Encoding]::UTF8)
        $Message.IsBodyHtml = $true
        
        # Create and set SMTP
        $SmtpClient = New-Object System.Net.Mail.SmtpClient 'smtp.office365.com', 587
        $SmtpClient.Credentials = $Using:O365Credential
        $SmtpClient.EnableSsl   = $true
  
        # Send email message
        $SmtpClient.Send($Message)
    }
}

 
This runbook sends an email from “no-reply@domain.com” to “ITOPS@domain.com”, and includes the input parameters (RunbookName and MessageBody) in the subject and body of the email. You can customize this based on your preferences.

 
Call the runbook
The way we use this runbook is by calling this from another runbook that runs our database maintenance. This runbook is copied from this post, except for the try-catch block.

By adding the $ErrorActionPreference and try-catch block, it’s possible to receive an email if a runbook fails:

workflow DBM_Servername_Databasename
{
    try
    {
        # Set error action preference
        $ErrorActionPreference = 'Stop'

        inlinescript
        {
                # Output status to console
                Write-Output "Creating connection..."

                # Create connection to database
                $DatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
                $DatabaseConnection.ConnectionString = "Data Source=Servername.database.windows.net;Initial Catalog=DatabaseName;Integrated Security=False;User ID=UserName;Password=Password"
                $DatabaseConnection.Open()

                # Output status to console
                Write-Output "Creating command..."

                # Create command to execute
                $DatabaseCommand = New-Object System.Data.SqlClient.SqlCommand
                $DatabaseCommand.Connection = $DatabaseConnection
                $DatabaseCommand.CommandText = "EXEC dba.ExecuteMaintenance"
                $DatabaseCommand.CommandTimeout = 0;

                # Output status to console
                Write-Output "Executing payload..."

                # Execute the query
                $DatabaseCommand.ExecuteNonQuery()

                # Output status to console
                Write-Output "Closing connection..."

                # Close connection to database
                $DatabaseConnection.Close()
        }
    }
    catch
    {
        # Output status to console
        Write-Output "Catchie catchie..."
    
        # Set $errorMessage
        $errorMessage = $_
        
        # Call EmailError-runbook in order to send a failure-email
        EmailError `
            -RunbookName "DBM_Servername_Databasename" `
            -MessageBody $errorMessage

        # Output status to console
        Write-Output "Mail sent..."
    }
}

 
I’ve included a bunch of “Write-Output” lines in there, so that when I test it through the Azure portal I can see the status of the runbook. You can remove those if you want a clean runbook, of course.

 
Conclusion
By using PowerShell to automate this type of work, it’s easy to extend your jobs like we did. Adding an email component is just as easy as altering the payload in your runbook. This extensibility and modular approach is the way to go if you want to deliver a solution that is easy to maintain and scalable.

And let me tell you something, that PowerShell is going to be big someday, mark my words! 😉


Running database maintenance on Azure SQL DB with Azure Automation

This article was also published on dev.getroadmap.com:

Running all of your databases in Azure SQL DB (the PaaS solution, also known as Azure SQL database) is a wonderful thing, except for one thing: you still need to run database maintenance. And because the SQL Server Agent is not available (which is a shame in my opinion), you have to find a way around that.

In one of my previous posts (Running maintenance on Azure SQL databases), I showed you how to leverage the power of the database maintenance solution, written by Ola Hallengren (Website | @olahallengren). But now that we’re moving into Azure completely, we’re losing that work-around. So how did we fix this?

 
Azure Automation
Microsoft gives you a tool to automate your processes with something called “Azure Automation”. With this tool you can schedule “Runbooks” that contain PowerShell. This allows you to execute the stored procedure that is executing the actual maintenance on your database.

 
Creating a runbook
In order to create a runbook, you need to login to the Azure portal, navigate to “Automation Accounts”, and create a new account:

 
When that is created (this usually only takes a few seconds), open the account, and click “Runbooks”. Click “Add a runbook” to create a new runbook, choose “Create a runbook”, and give your new runbook a name:

 
It is important that you choose “PowerShell Workflow” as Runbook type.

 
Adding the script
In this example I use the scripts I blogged about earlier. This allows me to just execute a single stored procedure to kick off the maintenance process.

By adding this to the Runbook a connection to the specific database is opened, and the stored procedure is executed:

 
For your benefit, here is the same script so you can copy-paste it:

    Write-Output "Ready to start inlinescript..."

    inlinescript
    {
        Write-Output "Creating connection..."

        # Create connection to database
        $DatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
        $DatabaseConnection.ConnectionString = "Data Source=Servername.database.windows.net;Initial Catalog=DatabaseName;Integrated Security=False;User ID=UserName;Password=Password"
        $DatabaseConnection.Open()

        Write-Output "Creating command..."

        # Create command
        $DatabaseCommand = New-Object System.Data.SqlClient.SqlCommand
        $DatabaseCommand.Connection = $DatabaseConnection
        $DatabaseCommand.CommandText = "EXEC dba.ExecuteMaintenance"
        $DatabaseCommand.CommandTimeout = 0;

        Write-Output "Executing payload..."

        # Execute the query
        $DatabaseCommand.ExecuteNonQuery()

        Write-Output "Closing connection..."

        # Close connection to database
        $DatabaseConnection.Close() 
    }

 
To start using this Runbook, you not only need to save it, but also publish it:

 
Schedule
Now that you have a Runbook, you can schedule it to run at the time you want it to. You can do that by clicking on “Schedule”, and either create a new schedule or select an existing one (if you want to use a schedule you created earlier):

 
Conclusion
Azure Automation is easy to use, and allows you to do almost anything on Azure using PowerShell. The only downside I’ve found so far is that the job output and outcome are kind of tucked away inside Azure. There’s no way to send out an email when a job fails, for example (which was the setup we used on-premises).

There must be a way around that, but I haven’t found it yet. If I do, that sounds like a good follow-up blog post! 🙂

T-SQL Tuesday #39 – Can you shell what the PoSH is Cooking?

T-SQL Tuesday is a recurring blog party started by Adam Machanic (Blog | @AdamMachanic). Each month a blog hosts the party, and everyone who wants to can write a post about a specific subject.

This month the subject is “Can you shell what the PoSH is Cooking?”. If you want to read the opening post, please click the image below to go to the party-starter: Wayne Sheffield (Blog | @DBAWayne).



A few months ago I attended a session of Jeff Wouters (Blog | @JeffWouters) about PowerShell. He talked about how powerful PowerShell really is, and showed us some examples. This was my first encounter with the “commandline on steroids”. Back then I didn’t think I’d use that any time soon, but this is changing fast! The wide variety of possibilities that PowerShell offers made me curious, but I never found the time and use for it. Until last week…

A great number of things have changed in the last few weeks, and that gave me the opportunity to play around with PowerShell. Starting with reading a tutorial and creating my first few trial scripts, it soon hit me that it isn’t that easy. But on the other hand, if you succeed at writing a script, it’s very powerful!

PowerShell ISE
One of the very first things I tried after starting the PowerShell ISE (the “studio” in which you can write your scripts) was to find all objects in my database. The first thing you need to do is:

#Set the Execution-Policy to Unrestricted
Set-ExecutionPolicy Unrestricted

#Run the Import-Module below once, so you can use the SQLSERVER functionality
Import-Module SQLPS

This will allow you to access your SQL Server objects. After that, you can start with the easy stuff like:

dir SQLSERVER:\sql\localhost\SQL2012\databases\Sandbox\Tables

dir SQLSERVER:\sql\localhost\SQL2012\databases\Sandbox\Views

dir SQLSERVER:\sql\localhost\SQL2012\databases\Sandbox\StoredProcedures

dir SQLSERVER:\sql\localhost\SQL2012\databases\Sandbox\Users

This is very basic, and it’s really easy to find out how this works. From PowerShell, the results will look like this:

And from the “PowerShell ISE”, it will look like this:

Another thing I tried is to automatically create a database on my local server:

[void] [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") 
[void] [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") 

$serverInstance = "(local)\SQL2012"
$conn = New-Object -TypeName Microsoft.SqlServer.Management.Common.ServerConnection -ArgumentList $serverInstance 
$server = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList $conn

$newdb = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database($server, "PowerShellToTheRescue")
$newdb.Create()

This script will create a new database called “PowerShellToTheRescue” on your local SQL Server. I assume it’s also possible to create tables in the database, but I haven’t found the time to test that.
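For what it’s worth, here is an untested sketch of how creating a table could look with the same SMO objects, reusing the $server object from the script above; the table and column names are just placeholders:

```powershell
# Get a reference to the database we just created
$db = $server.Databases["PowerShellToTheRescue"]

# Define a new table with a single integer column
$table  = New-Object Microsoft.SqlServer.Management.Smo.Table($db, "TestTable")
$column = New-Object Microsoft.SqlServer.Management.Smo.Column($table, "Id", [Microsoft.SqlServer.Management.Smo.DataType]::Int)
$table.Columns.Add($column)

# Create the table in the database
$table.Create()
```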

PowerShell through SQL Server Management Studio (SSMS)
Another option is to start PowerShell via a menu in SSMS. This means PowerShell starts with the current database as its default context. So you don’t have to use “Import-Module”, and you can start running your queries right away. You can find this option by right-clicking your database and choosing “Start PowerShell”:
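Because the provider path then already points at your database node, the commands from earlier become even shorter. A sketch, assuming you right-clicked the “Sandbox” database:

```powershell
# The current location is already the database, so relative paths work
dir Tables
dir Views
dir StoredProcedures
```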

So, what are you saying?!
This month’s T-SQL Tuesday post isn’t that impressive, I know! But what I’m trying to say is that if you try PowerShell, you might find it quite powerful and fun.

And thinking about it, I guess you can expect some more PowerShell posts from me in the next few weeks! 😉