Azure Service: Moving Storage from Development (local) to the Cloud
As part of moving my Azure test service from running entirely locally (on what's called the "Development Fabric" and "Development Storage", i.e. the local-machine environment for development) to the Cloud, I followed the steps on MSDN, which suggest making the move in two stages:
- Run your service locally (on "development fabric") connecting to cloud-based storage.
- Once that's working, move your service to the cloud fabric.
So how do you get your local service to talk to cloud-based storage? Turns out the hardest part is getting the storage table created on the Cloud (or is it "in the Cloud"? :))
The old way suggested having a method in your DataContext that you call to check whether the tables exist and, if not, create them. But a more recent post from Steve Marx points out that running this check constantly is very inefficient. He proposes adding a new entry point to do it, but another post, from Mark Seemann, offers an even more elegant solution: write a PowerShell script that invokes the table-creation method once, out of band of any running service code. Since I'd been meaning to play around a bit with PowerShell anyway, this approach had an extra benefit.
Using Mark's script as a model, I eventually got this working and now have my service connected to the cloud storage.
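For reference, here's the rough shape of such a script. This is only a sketch modeled loosely on Mark's approach, not his exact script: the assembly paths, account name, and key are placeholders, and the `StorageAccountInfo` constructor and `TableStorage.CreateTablesFromModel` method are from the StorageClient sample library as I understand it, so check them against your copy of the sample.

```powershell
# Load the StorageClient sample assembly and the assembly containing your DataContext.
# Paths below are placeholders for your own build output.
[Reflection.Assembly]::LoadFrom("C:\path\to\StorageClient.dll")
$svc = [Reflection.Assembly]::LoadFrom("C:\path\to\FerryService.dll")

# Point at the cloud table-storage endpoint for your account.
# Account name and key are placeholders.
$uri  = New-Object Uri("http://table.core.windows.net")
$info = New-Object Microsoft.Samples.ServiceHosting.StorageClient.StorageAccountInfo(
    $uri, $null, "myaccount", "base64accountkey==")

# Resolve the context type from the loaded assembly and create its tables,
# once, out of band of the running service.
$ctxType = $svc.GetType("FerryService.FerryContext")
[Microsoft.Samples.ServiceHosting.StorageClient.TableStorage]::CreateTablesFromModel($ctxType, $info)
```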
The changes I had to make were:
- Make sure your DataContext class, which inherits from the StorageClient TableStorageDataServiceContext class, is public. Otherwise, the PowerShell script won't be able to load it.
- Make sure it has two constructors (here are mine):
public FerryContext() : base() { }
public FerryContext(StorageAccountInfo info) : base(info) { }
- These are needed by the PowerShell script but not by my previous service code.
- The other customizations are specified in Mark's blog post referenced above.
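Putting those pieces together, the context class ends up looking roughly like this (FerryContext and the namespace are from my service; the base class and StorageAccountInfo come from the StorageClient sample library — the entity sets are omitted here):

```csharp
using Microsoft.Samples.ServiceHosting.StorageClient;

// Must be public so the PowerShell script can load and use it.
public class FerryContext : TableStorageDataServiceContext
{
    // Parameterless constructor, used by the running service,
    // which picks up account settings from configuration.
    public FerryContext() : base() { }

    // Constructor taking explicit account info, used by the
    // out-of-band PowerShell script.
    public FerryContext(StorageAccountInfo info) : base(info) { }
}
```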