Russ Thomas – SQL Judo

The Art of SQL Server Database Administration, Development, and Career Skills for the Technically Minded

T-SQL Tue #50 Log Shipping With Powershell – A Better Mousetrap?

This month the Tsql2sDay topic is automation – props to our host SqlChow for the topic. Props to Adam Machanic for the rolling party.

Post Disclaimer: Normally, if there is a process Microsoft has already implemented, I’m not a big proponent of rolling my own. I would never, for example, decide to write my own merge replication approach (I’ve seen it) or full text indexing scheme (uh, ok, tried it).

About a year ago Matt Hubble (twitter: @MSHubble) opened my eyes to new possibilities regarding log shipping with PowerShell.  I had fiddled with PS but had never moved a solution to production.  I came to my DBA career via a development path, so I tend to look at anything that can’t be handled by SQL Server as a job for C#.  Matt’s solution not only met a critical production need but also showed me the simple elegance of PS scripting.

Log Shipping

Setting The Stage

The group was shopping 3rd party solutions for managing backups.  We were considering features such as improved compression, policy management, automated monitoring, and reporting and auditing.

The problem?  Several databases also required log shipping to DR.  Many 3rd party backup providers support log shipping, but the database in question was also participating in a high availability failover mirror.  We needed our log ship to DR to continue without interruption.

In short, our list of log shipping needs looked like this:

  1. Must survive failover from one mirror to another
  2. Must not break during network blips
  3. Must support third party compressed backups
  4. Must clean up after itself
  5. Must self monitor

Microsoft’s native offerings and the third party solutions each came close to meeting these needs – but none to the satisfaction of the team.

Enter Matt’s PowerShell solution.

When I asked permission to use his script for this topic he immediately began talking about its shortcomings: “It needs better error handling, more elegant logging, etc. etc. etc.”  It may be without a lot of flair, but the concept has proven sound month after month.  Matt’s got a great mind for production.  He knows what it takes to work, isn’t afraid to roll it out when it’s solid, and continually works toward a more perfect way as priorities allow.

The Solution

This was the resulting PowerShell implementation.  The script runs as an agent job on a 15 minute interval.  It starts by setting up environment variables.  Take note of the log backup share: both the primary and the failover partner use the same share location for backups.  In other words, PS is happily agnostic as to which partner is currently primary.  Problem one solved.

#####################################################
# NAME: Log shipping restore with grooming
# AUTHOR:  @mshubble
# COMMENT: Restores log backups in order of 
#    creation time and grooms those that are 
#    restored and older than retention period
#####################################################

# Variables
$databaseName = "ImportantData"
$logBackupFolder = "\\Important_Share\LogBackups" # no trailing slash
$tempLogFolder = "V:\LogBackup\ImportantData" # no trailing slash
$undoLocation = "I:\ImportantData" # Standard location: transaction log folder, no trailing slash
$retentionPeriod = 8 # in days
$winUserName = "domain\service_account"
$winEncryptedPW = "SOMECRAZYLONGGUID"

At this point we make sure the locations needed are accessible.  If not, the error is noted in the job history and the process will try again in another 15 minutes.

## Make sure required files/folders are accessible
if(!(Test-Path $logBackupFolder)) { throw "Log folder inaccessible" }
if(!(Test-Path $tempLogFolder)) { throw "Temp log folder inaccessible" }
if(!(Test-Path $undoLocation)) { throw "Undo location inaccessible" }

We now set the undo location (for standby mode) and begin a loop to search the backup share.

This sample script is set up to run with Idera’s SQLsafe product.  If you look at the condition on the foreach loop you’ll see that it filters out anything not ending in .safe.  Once the loop restores a backup it simply changes the file extension to .done.  Sorting by creation time ensures we always get the next .safe file in line.

Notice the loop will process all backups supplied since the last run.  If it missed a couple of runs due to network outages or stalled backups – no biggie – it’ll catch up during the next run.  Requirement two – check.

## Begin log restore
# Get unrestored log files in backup folder
$undoLocation = [System.String]::Concat($undoLocation,"\%instance%_%database%_%backuptype%_%utctimestamp%.undo")
foreach ($logName in (Get-ChildItem $logBackupFolder | Where {$_.Extension -eq ".safe"} | Sort-Object CreationTime))

Now for the magic within the loop.

File by file, it copies the log backup from the network share to the local DR server, then applies it using the 3rd party command line syntax.  Idera is shown, but any product with command line support would work – see the native sketch just after the loop.  Support for 3rd party – check.

Then, as mentioned above, it renames the file to .done, keeping the job from attempting to reapply it on future runs.

    {
    $logBackupPath = [System.String]::Concat("`"",$logName.FullName,"`"")
    $oldLogBackupPath = $logName.FullName
    $newLogBackupPath = [System.String]::Concat($logName.FullName,".done")
    $tempLogBackupPath = [System.String]::Concat($tempLogFolder,"\",$logName.Name)

    Copy-Item $oldLogBackupPath $tempLogBackupPath
    # Restore SQLsafe log
    & "C:\Program Files\Idera\SQLsafe\SQLsafeCmd.exe" Restore $databaseName $templogBackupPath -DisconnectUsers -RecoveryMode Standby -UndoFile $undoLocation -WindowsUsername $winUserName -EncryptedWindowsPassword $winEncryptedPW -NoPrompt
    # Rename log file if restored successfully
    If ($lastexitcode -eq 0)
        {
        Remove-Item $tempLogBackupPath
        Rename-Item $oldLogBackupPath $newLogBackupPath
        }
    }
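
To underline that “any product” point, here is a minimal sketch (not part of Matt’s script) of what the same restore step might look like with plain native .trn log backups, assuming the SqlServer module’s Invoke-Sqlcmd and a local default instance.  Note the %token% undo template above is SQLsafe-specific, so a plain undo file name is assumed here.

# Hypothetical native equivalent of the SQLsafeCmd call above
# Assumes plain .trn log backups restored to standby (readable) mode
$restoreSql = "RESTORE LOG [$databaseName] " +
    "FROM DISK = N'$tempLogBackupPath' " +
    "WITH STANDBY = N'I:\ImportantData\ImportantData.undo'"
Invoke-Sqlcmd -ServerInstance "localhost" -Query $restoreSql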

With all available logs restored we can now clean up per our retention policy.  Number four – check.

## Begin log grooming
$retentionDay = [DateTime]::Now.AddDays(-$retentionPeriod)
# Find restored logs older than $retentionPeriod
foreach ($file in (Get-ChildItem $logBackupFolder) | where {($_.CreationTime -le $retentionDay) -and ($_.Extension -eq ".done") } )
    {
    Remove-Item $file.FullName 
    }

The only requirement left on the table is for the script to keep us informed any time it’s having problems.

For this we set up a second SQL Agent job that queries the status of the most recent restore – a simple ranking query against the restorehistory table within msdb.  If the latest restore is older than the tolerances we define, it lets us know via dbmail.  Number five – check.
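
Matt’s monitoring script isn’t reproduced here, but a minimal sketch of the idea might look like the following – the instance name, one hour tolerance, mail profile, and recipients are all placeholder assumptions.

# Watchdog sketch (assumed names and thresholds, not the original script)
# Find the most recent log restore recorded for the database
$query = @"
SELECT TOP (1) restore_date
FROM msdb.dbo.restorehistory
WHERE destination_database_name = N'ImportantData'
  AND restore_type = 'L' -- log restores only
ORDER BY restore_date DESC;
"@
$last = Invoke-Sqlcmd -ServerInstance "localhost" -Query $query

# Alert via Database Mail if nothing has been restored within tolerance
if (!$last -or $last.restore_date -lt (Get-Date).AddHours(-1))
    {
    $alert = @"
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = N'DBA Mail',            -- placeholder profile
    @recipients = N'dba-team@example.com',  -- placeholder recipients
    @subject = N'Log shipping restores are behind on ImportantData';
"@
    Invoke-Sqlcmd -ServerInstance "localhost" -Query $alert
    }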


This solution reiterates to me that whatever your flavor of database employment, you should add some type of scripting, coding, or logic to your bag of tricks.  The tasks you’d like to accomplish won’t always have an off-the-shelf solution or a slick GUI where you just click next, next, next.  Even solutions that do might occasionally be more efficiently implemented another way.

Image Credits:  Flickr graeagleworld old time log truck.

2 comments on “T-SQL Tue #50 Log Shipping With Powershell – A Better Mousetrap?”

  1. Thomas Fraefel
    January 7, 2015

    Hi,

    looks nice – is it possible to download the full scripts anywhere? Or did I fail to see the link somewhere?

    Thx in advance.

    /T


This entry was posted on January 14, 2014 by Russ Thomas in Career Skills.