Backup SQL Database using PowerShell in an Isolated Environment

Naw Awn · 6 min read

There may be a time when you need to back up a SQL Express database on a schedule, and the server is running in a locked-down environment with no access to the internet. In this blog post I want to share how I work around this: a PowerShell script backs up the database and copies the backup files to another server, which then backs them up to Azure Recovery Services.

The Scenario:

  • The server runs SQL Server Express Edition.
  • It is a standalone workgroup (non-domain joined) server.
  • It is running on an isolated network with no internet access.
  • SMB ports are blocked between the isolated SQL LAN and the Server LAN.

The Objectives:

  • Back up the required database daily
  • Upload the database backup file to Azure Recovery Services

The Solution:
There are many different ways to achieve the same result. However, this is the way I happened to do it.

  1. Set up a VM on the Server LAN that has internet access
  2. Install the Microsoft Azure Recovery Services (MARS) agent and configure the Files and Folders backup
  3. Set up an SFTP server on the same VM
  4. Create a local credential for the SFTP connection
  5. Write the PowerShell script to run on the SQL server
  6. Set up a scheduled task to run the PowerShell script
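For step 4, the local account on the SFTP VM can be created with the built-in LocalAccounts cmdlets. A minimal sketch, run in an elevated session on the VM; the account name sftpsvc is a hypothetical placeholder:

```powershell
# Prompt for the password rather than hard-coding it in the script
$Password = Read-Host -Prompt 'Password for sftpsvc' -AsSecureString

# 'sftpsvc' is a hypothetical name; pick one that fits your naming convention
New-LocalUser -Name 'sftpsvc' -Password $Password -PasswordNeverExpires
Add-LocalGroupMember -Group 'Users' -Member 'sftpsvc'
```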

Infrastructure

Since this is SQL Server Express Edition, which does not include SQL Server Agent, you cannot set up a scheduled backup from SQL Server Management Studio. It has to be done from the Windows Task Scheduler, using either a sqlcmd script or a PowerShell script. For a sqlcmd script you will also need to install the sqlcmd utility beforehand. Alternatively, you can just use the PowerShell v5 that comes with the OS for your PowerShell script. I also try to minimise the use of AD credentials when the backup files (.bak) are copied across.
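For reference, the sqlcmd route would look something like this; the instance name, database, and backup path below are illustrative:

```powershell
# -S names the instance, -E uses Windows authentication, -Q runs the query and exits
sqlcmd -S ".\SQLEXPRESS" -E -Q "BACKUP DATABASE [ExampleDB] TO DISK = N'C:\SQLBackup\ExampleDB.bak' WITH INIT"
```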

So, I started by writing individual functions, each of which performs a specific task, and then used those functions in the main script. Below is the script that runs on the SQL Express server. It requires the dbatools and Posh-SSH PowerShell modules.

Backup-MSSqlDb.ps1
#Requires -Modules DbaTools, Posh-SSH
$here = Split-Path $MyInvocation.MyCommand.Path

Function Backup-SqlDb{
    Param(
        [Parameter(Mandatory)][String]$SqlInstance,
        [ValidateNotNullOrEmpty()][String[]]$Database,
        [String]$Path,
        [PSCredential]$SqlCredential
    )
    Begin{
        #Build the parameter set for Backup-DbaDatabase, adding the optional values only when supplied.
        #Note: an unbound [String] parameter is '' rather than $null, so test with IsNullOrEmpty.
        $SqlParam = @{}
        $SqlParam['SqlInstance'] = $SqlInstance
        $SqlParam['Database'] = $Database
        If (-Not [String]::IsNullOrEmpty($Path)){ $SqlParam['Path'] = $Path }
        If ($null -ne $SqlCredential){ $SqlParam['SqlCredential'] = $SqlCredential }
    }
    Process{
        Backup-DbaDatabase @SqlParam
    }
}

Function Upload-SftpFile{
    Param(
        [Parameter(Mandatory, ValueFromPipeline)]
        [String[]]$FilePath,
        [String]$ComputerName,
        [String]$Username,
        [SecureString]$Password,
        [String]$RemotePath
    )
    Begin{
        Write-Verbose "Initializing Sftp Session..."
        $Cred = New-Object System.Management.Automation.PSCredential($Username,$Password)
        $Sftp = New-SFTPSession -ComputerName $ComputerName -Credential $Cred -AcceptKey
    }
    Process{
        If(-Not $Sftp){
            Write-Warning "Unable to establish Sftp session!"
            return
        }
        Write-Verbose "Uploading file to the Sftp Server..."
        Foreach($File in $FilePath){
            Write-Verbose "`t $File"
            Set-SFTPFile -SFTPSession $Sftp -RemotePath $RemotePath -LocalFile $File
        }
    }
    End{
        If ($Sftp){
            Write-Verbose "Closing Sftp Session..."
            Remove-SFTPSession -SFTPSession $Sftp | Out-Null
        }
    }
}

Function Remove-SqlBackup{
    Param(
        [Parameter(Mandatory)][Int]$Day,
        [ValidateNotNullOrEmpty()][String]$FolderPath
    )
    Process{
        If ($Day -gt 0){
            Write-Warning "The number of days should be negative."
            Return
        }
        #Only target files, so sub-folders are not swept up by accident
        $FileList = Get-ChildItem -Path $FolderPath -File |
            Where-Object{ $_.LastWriteTime -lt (Get-Date).AddDays($Day) }

        If ($FileList) {
            (Get-Date) | Add-Content -Path $FolderPath\DeletionLog.txt
            $FileList | Add-Content -Path $FolderPath\DeletionLog.txt
            $FileList | Remove-Item -Force -ErrorVariable errLog
            If ($errLog) {
                $errLog | Add-Content -Path $FolderPath\DeletionLog.txt
            }
        }
    }
}

Function UnProtect-Base64{
    [OutputType([String])]
    Param( [String]$Text )
    Process{
        #Decode a Base64 string back to plain UTF8 text
        return [System.Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($Text))
    }
}

#Main Function
Function Backup-MSSqlDb{
    Param(
        [Parameter()]
        [String]$ConfigFilePath,
        [Switch]$UseSftp
    )
    Process{
        Write-Verbose "Importing config file..."
        If (-Not(Test-Path -Path $ConfigFilePath)){
            Write-Warning "Configuration File not found!"
            return
        }
        $Config = Import-PowerShellDataFile $ConfigFilePath

        Write-Verbose "Removing old backup files..."
        If(($null -ne $Config.Cleanup.Day) -And ($null -ne $Config.Cleanup.FolderPath)){
            Remove-SqlBackup -Day $Config.Cleanup.Day -FolderPath $Config.Cleanup.FolderPath
        }

        Write-Verbose "Backing up the Sql Database..."
        $BackupParam = @{
            SqlInstance = $Config.SqlBackup.SqlInstance
            Database    = $Config.SqlBackup.Database
        }
        #If username and password are provided for the SQL server, create a new credential
        If(-Not(([String]::IsNullOrEmpty($Config.SqlBackup.Username)) -And ([String]::IsNullOrEmpty($Config.SqlBackup.Password)))){
            $SecureStr = UnProtect-Base64 -Text $Config.SqlBackup.Password | ConvertTo-SecureString -AsPlainText -Force
            $BackupParam['SqlCredential'] = New-Object System.Management.Automation.PSCredential($Config.SqlBackup.Username,$SecureStr)
        }
        If($null -ne $Config.SqlBackup.Path){
            $BackupParam['Path'] = $Config.SqlBackup.Path
        }
        Try {
            $OutputFile = Backup-SqlDb @BackupParam
        }
        Catch {
            #Inside a catch block, $_ is the current error record
            Throw $_.Exception.Message
        }

        #Upload the backup files to the Sftp Server
        If($UseSftp){
            $UploadParam = @{
                FilePath     = $OutputFile.Path
                Username     = $Config.Sftp.Username
                Password     = (UnProtect-Base64 $Config.Sftp.Password | ConvertTo-SecureString -AsPlainText -Force)
                RemotePath   = $Config.Sftp.RemotePath
                ComputerName = $Config.Sftp.ComputerName
            }
            Upload-SftpFile @UploadParam
        }
    }
}

Backup-MSSqlDb -ConfigFilePath $here\Backup-Config.psd1 -UseSftp -Verbose

When I write scripts, I try to make them reusable and adaptable to a new environment, in case they need to be redeployed on a different server in the future. So, instead of having a bunch of variables at the top of the script, I create a configuration data file in psd1 format. See below for the actual config file.
The password value must be Base64-encoded before it is entered in the config file. The script then decodes the value and turns it into a credential object at run time. This is just to avoid storing a plain-text password in the configuration file; note that Base64 is an encoding rather than encryption, so it only obscures the value from casual reading.
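To produce the Base64 value for the config file, you can encode the password once in a console and paste in the result; 'MySecret' below is obviously a placeholder:

```powershell
# Encode the plain-text password to Base64 for the config file
$plain   = 'MySecret'   # placeholder password
$encoded = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($plain))
$encoded                # TXlTZWNyZXQ=

# The script reverses this with UnProtect-Base64 at run time
$decoded = [System.Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($encoded))
```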

Backup-Config.psd1
@{
    SqlBackup = @{
        #Username and Password are optional if the logged-on user has the necessary rights.
        Username    = ''
        Password    = ''
        Database    = @('ExampleDB','NorthWinds')
        SqlInstance = 'ServerName\InstanceName'
        #Path = 'Optional - new backup path other than the default location'
    }
    Sftp = @{
        #RemotePath needs to use forward slash "/" instead of backslash "\"
        Username     = ''
        Password     = ''
        RemotePath   = '/D:/Data/SQLBackup'
        ComputerName = 'SftpServerName'
    }
    Cleanup = @{
        #Delete local backups older than the number of 'Day' specified
        Day        = -7
        FolderPath = ''
    }
}

I won't go through how the MARS agent was set up to run the backup. The steps are straightforward and easy to follow from the Azure Backup - Microsoft Docs. The instructions for setting up an SFTP server on Windows Server using OpenSSH are also available in the OpenSSH - Microsoft Docs.
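As a quick reference, the OpenSSH setup boils down to the commands below; the capability name can vary between Windows builds, so check what Get-WindowsCapability reports on your server:

```powershell
# Install the OpenSSH server capability and start the sshd service
Add-WindowsCapability -Online -Name OpenSSH.Server~~~~0.0.1.0
Start-Service sshd
Set-Service -Name sshd -StartupType 'Automatic'
```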

The SFTP service uses port 22, so this port will need to be opened between the two network segments. The last thing you need to do is set up the scheduled task on the SQL server.
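The scheduled task can be registered from PowerShell as well; the script path, task name, and 2 AM trigger below are assumptions to adjust for your environment:

```powershell
# Run the backup script daily at 2 AM as SYSTEM (illustrative path and schedule)
$Action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\Backup-MSSqlDb.ps1"'
$Trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName 'Backup-MSSqlDb' -Action $Action -Trigger $Trigger `
    -User 'NT AUTHORITY\SYSTEM' -RunLevel Highest
```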

Having gone through the process above, you could still install the Azure Backup agent directly on the machine, open up the network for internet access, or use a proxy server to reach Azure Recovery Services if your network permits.

However, the above exercise was just one of the feasible solutions, since there were servers already available running the MARS agent and backing up files and folders to Azure Backup. I hope the script comes in handy if you come across a similar scenario.