
Schedule Azure BLOB storage backups with Azure Functions

December 12, 2017
Arra Derderian

Cloud Construct has been working with Windows Azure for close to nine years now. Azure offers a variety of platform capabilities as well as tools to easily manage your web infrastructure. Some of the key features we need when hosting a client's site are the ability to back up the SQL Azure database, take backups of the web root, and back up BLOB storage data. Right now it is fairly straightforward to back up your SQL Azure data via an Azure Recovery Services vault and Long-Term Backup, which lets you retain your data for up to 10 years. For the web root, you can use the Standard hosting plan and configure automatic backups if you wish. The last piece has always been a bit tricky: Azure already replicates your data to sister data centers, but that does not protect you from a mistake or an accidental deletion. BLOB snapshotting can be complicated and is not truly effective for something like a CMS. Cloud Construct often builds sites on Orchard CMS, a free, open source CMS built on the Microsoft .NET Framework that hosts well on Windows Azure. It is common practice to store your CMS "media" on Windows Azure using the Blob Storage module. Keeping a backup of important website data is crucial to maintaining a solid hosting infrastructure.

Azure Functions to Back Up BLOB Storage

Azure offers a great tool called Azure Functions. Functions let you run code snippets in various languages such as JavaScript and C#, and that code executes in response to key integration points such as a Timer, Storage trigger, or Queue trigger. This makes Functions ideal for performing scheduled backups. Wiring up a new Timer-triggered job is the first step to setting up a scheduled backup of website data.

Step 1 - Create a new Azure Function App

Browse to the Azure Portal and choose "New" from the left-hand menu. Choose "Compute" and then select "Function App". You can then fill out the details needed to host the Function based on your own hosting infrastructure.

Performing this step also creates a new Storage account associated with the Function App, which you can use as the storage location for your backups.

Step 2 - Create a new Function called "CopyBlobs" with a Timer trigger.

Once the new Function App is provisioned, you can add a new Function called "CopyBlobs".

When setting up the new Function, use the Timer Trigger template.
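The template's schedule is just a six-field CRON expression in the function's function.json binding, which the portal generates for you. Below is a sketch of what that binding might look like; the schedule "0 0 2 * * *" (2:00 AM every day) and the parameter name "myTimer" are placeholder choices you can adjust to fit your own backup window.

{
  "bindings": [
    {
      "type": "timerTrigger",
      "name": "myTimer",
      "direction": "in",
      "schedule": "0 0 2 * * *"
    }
  ],
  "disabled": false
}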

Step 3 - Create a new file to import the necessary Storage library DLLs.

You will want to add a new file called "project.json" with the content below. This allows the proper libraries to be imported for your code to reference. To add the new file, click the "View Files" link on the right side of the screen and expand the file explorer.
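A project.json along the lines of the sketch below pulls in the WindowsAzure.Storage package that the code in the next step references; the target framework and package version shown here (net46 and 8.1.1) are assumptions, so pin whichever version you have tested against.

{
  "frameworks": {
    "net46": {
      "dependencies": {
        "WindowsAzure.Storage": "8.1.1"
      }
    }
  }
}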

Step 4 - Add the proper code to the run.csx file.

Below is a snippet of code you can use to copy all blob data from a container in one storage account to a container in a separate storage account. Keeping the backup in a separate account matters because it adds a layer of protection. Copy the code, replace the "YOURACCOUNTNAME" and "YOURKEY" placeholders with the source and destination account names and keys you want to use, and click "Save". Note that the container being copied here is called "media". Once complete, click "Run" to verify the task completes properly. You should see the new container in your destination storage account with all the files copied in recursively.

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static async Task Run(TimerInfo myTimer, TraceWriter log)
{
    log.Info($"C# Timer trigger function executed at: {DateTime.Now}");

    await CopyBlobs(log);
}

private static async Task CopyBlobs(TraceWriter log)
{
    // Source storage account (where the live "media" container lives)
    var keySource = "YOURKEY";
    var connectionStringSource = $"DefaultEndpointsProtocol=https;AccountName=YOURACCOUNTNAME;AccountKey={keySource}";
    CloudStorageAccount storageAccountSource = CloudStorageAccount.Parse(connectionStringSource);

    // Create the source blob client
    CloudBlobClient blobClientSource = storageAccountSource.CreateCloudBlobClient();
    CloudBlobContainer containerSource = blobClientSource.GetContainerReference("media");

    // Destination storage account (where the backup copy is kept)
    var key = "YOURKEY";
    var connectionString = $"DefaultEndpointsProtocol=https;AccountName=YOURACCOUNTNAME;AccountKey={key}";
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);

    // Create the destination blob client
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference("media-production-backup");

    // Create the destination container if it doesn't already exist.
    try
    {
        await container.CreateIfNotExistsAsync();
    }
    catch (Exception e)
    {
        log.Error(e.Message);
    }

    log.Info("Starting Copy");

    try
    {
        // Flat listing walks every blob in the container, including "subfolders".
        foreach (var blob in containerSource.ListBlobs(useFlatBlobListing: true))
        {
            Uri thisBlobUri = blob.Uri;
            var serverBlob = blobClientSource.GetBlobReferenceFromServer(thisBlobUri);

            // Start a server-side copy into the backup container under the same name.
            // Note: the source blob must be readable by the destination service
            // (e.g. a public CMS media container, or a SAS URI).
            CloudBlockBlob targetBlob = container.GetBlockBlobReference(serverBlob.Name);
            await targetBlob.StartCopyAsync(thisBlobUri);

            log.Info($"BlockBlob destination name: {targetBlob.Name}");
        }

        log.Info("Copy completed");
    }
    catch (Exception ex)
    {
        log.Error(ex.Message);
        log.Info("Copy failed");
    }
    finally
    {
        log.Info("Operation completed");
    }
}
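One caveat worth noting: StartCopyAsync only schedules a server-side copy, so a large blob may still be copying after the loop finishes. If you want the function to confirm that every backup blob actually landed, a helper along the lines of the sketch below can be called with the destination container after the copy loop; "WaitForPendingCopies" is a hypothetical name and the five-second polling interval is an arbitrary choice.

private static async Task WaitForPendingCopies(CloudBlobContainer container, TraceWriter log)
{
    foreach (var item in container.ListBlobs(useFlatBlobListing: true))
    {
        var blob = item as CloudBlockBlob;
        if (blob == null) continue;

        // Refresh the blob's properties so CopyState reflects the current server-side status.
        await blob.FetchAttributesAsync();

        while (blob.CopyState != null && blob.CopyState.Status == CopyStatus.Pending)
        {
            await Task.Delay(TimeSpan.FromSeconds(5)); // arbitrary polling interval
            await blob.FetchAttributesAsync();
        }

        if (blob.CopyState != null && blob.CopyState.Status != CopyStatus.Success)
        {
            log.Info($"Copy did not succeed for {blob.Name}: {blob.CopyState.Status}");
        }
    }
}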

Summary

To summarize, we scheduled a new Azure Function to recursively copy BLOB storage data from one storage account to another. This is helpful when we want to keep a safe backup of BLOB storage data in a separate account. Cloud Construct is a digital agency in Boston, MA that specializes in Azure, .NET, and Orchard CMS development.

Arra Derderian
Founder & Chairman