Automating Blob Uploads, Part 2

I tossed out a challenge in a comment I left on the first post of this series: there's a problem with this approach to automating blob uploads.  It's not a problem that affects everyone.  For example, perhaps you have a multi-tenant solution where each customer has their own storage account, so the key held by the client only unlocks that one customer's data.

When a Windows client wants to access the Azure storage service directly, the storage credentials must be available locally in order to sign the REST requests.  Usually, as in the code that I've provided, the credentials are stored in clear text in the configuration file.  However, if the client is part of a hybrid solution that many customers will run, you probably don't want to do this.  These credentials are the keys to the kingdom: with them, anything can be done with any of the data in the storage account.
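To make that concrete, the Part 1 style of client construction looks something like the sketch below.  The setting name is a placeholder of mine rather than the sample's, but whichever name is used, the full account key sits in the client's config file in clear text.

using System.Configuration;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public class ClearTextCredentialsClient
{
    // Reads the connection string (account name + account key) from the
    // client's local config file and builds a storage client from it.
    // "DataConnectionString" is an illustrative setting name.
    public CloudBlobClient CreateBlobClient()
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(
            ConfigurationManager.AppSettings["DataConnectionString"]);
        return account.CreateCloudBlobClient();
    }
}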

One solution to this problem is to put the credentials behind a web service call.  SSL-encrypt the call to the web service, authenticate callers with the AppFabric Access Control Service (ACS) or some other means, and then pass the storage credentials back to the client.  Without going into the details of encryption/decryption, certificates, etc., this solution is the same as the one I have already provided, with very minor tweaks to the code.

Another solution is the one for which I am providing additional sample code: use a Shared Access Signature (SAS) to protect blob storage access, and put queue storage access behind secured and authenticated web service calls.  In a nutshell, a SAS is a time-bombed key that grants specific access rights to either the blobs in a container or a single blob.
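As a rough sketch of what the web role hands back, generating a SAS with the storage client library looks something like this.  The container name, the write-only permission, and the fifteen-minute lifetime are illustrative choices, not necessarily what the downloadable sample uses.

using System;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public static class SasIssuer
{
    // Issues a short-lived, write-only SAS for a single container.
    public static string GetContainerSas(CloudStorageAccount account)
    {
        CloudBlobClient blobClient = account.CreateCloudBlobClient();
        CloudBlobContainer container = blobClient.GetContainerReference("uploads");
        container.CreateIfNotExist();

        var policy = new SharedAccessPolicy
        {
            Permissions = SharedAccessPermissions.Write,
            SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15)
        };

        // The return value is the query-string portion of the signed URL;
        // this is the "SAS string" handed back to the client.
        return container.GetSharedAccessSignature(policy);
    }
}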

In this modified solution, I have added a WCF web role to my Azure service.  This WCF web role is based on the Visual Studio template that is listed alongside the web role, worker role, and other role templates when creating a new cloud project.

[Screenshot: the WCF Service Web Role template in the new cloud project dialog]
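The contract in the downloadable sample may well be shaped differently; as a hypothetical sketch, the web role could expose operations along these lines (the interface and operation names are mine, not the sample's).

using System.ServiceModel;

// Hypothetical service contract for the WCF web role.
[ServiceContract]
public interface IUploadService
{
    // Returns the SAS string the client supplies on its blob upload calls.
    [OperationContract]
    string GetSharedAccessSignature();

    // Wraps queue storage so the client never holds queue credentials.
    [OperationContract]
    void SendNotificationMessage(string blobName);

    [OperationContract]
    string GetAcknowledgementMessage();

    [OperationContract]
    void DeleteAcknowledgementMessage(string messageId);
}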

The flow of control is altered only slightly:

  1. Prior to uploading the blob, the client calls the web service to request a SAS.  The SAS is a string, and that string is supplied as a parameter on the upload calls.
  2. Instead of accessing queue storage directly, the client makes additional web service calls (see the sketch after this list):
    1. Send notification message
    2. Get acknowledgement message
    3. Delete acknowledgement message
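
Putting the pieces together on the client, the upload path now looks roughly like the sketch below.  The proxy type is the hypothetical contract above, and the base address and blob naming are placeholders; the sample's actual code will differ in the details.

using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public class BlobUploader
{
    // service: client proxy for the hypothetical IUploadService above.
    // baseAddress: e.g. the account's blob endpoint URL.
    public void Upload(IUploadService service, string baseAddress, string localPath)
    {
        // 1. Ask the web role for a SAS instead of holding the account key.
        string sas = service.GetSharedAccessSignature();

        // The SAS string is the only credential the client ever sees.
        var credentials = new StorageCredentialsSharedAccessSignature(sas);
        var blobClient = new CloudBlobClient(baseAddress, credentials);

        string blobName = "uploads/" + Path.GetFileName(localPath);
        CloudBlob blob = blobClient.GetBlobReference(blobName);
        blob.UploadFile(localPath);

        // 2. Queue traffic goes through the web service, not queue storage.
        service.SendNotificationMessage(blobName);
        string ack = service.GetAcknowledgementMessage();
        if (ack != null)
        {
            // Treating the returned string as the message id; the real
            // sample's acknowledgement will carry more structure.
            service.DeleteAcknowledgementMessage(ack);
        }
    }
}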

While this solution addresses some security concerns, there is again a cost: the client cannot run without the server in this architecture.  I have traded loose coupling for security.  In my third article on this topic I'll partially deal with this problem.  (Hint: AppFabric Service Bus Queues)

Full source is provided here.

Enjoy!