Azure Cloud on Ulitzer

Working with Table Storage on Windows Azure

Table storage, under the hood, is exposed as an ADO.NET Data Service


If you've been working with Azure for a while, then you've probably spent some time using the StorageClient sample that came with previous versions of the SDK. With the November 2009 release of the SDK (the one they'll be using at PDC 2009), Microsoft has folded that sample into the SDK proper and refactored it to align with the conventions and quality standards of a Microsoft API. As a result, some of your code will break (but not much). Queue storage and blob storage (discussed in upcoming posts) actually have more breaking changes than table storage.

Table storage, under the hood, is exposed as an ADO.NET Data Service (formerly Astoria). As a result, if you've used the System.Data.Services.Client library before, you've already got a leg up in interacting with Azure Storage.
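Because the tables are exposed over the Data Services REST/Atom protocol, you can see what the client library does under the hood by issuing plain HTTP requests. As an illustrative sketch (the account name `myaccount` and the `Customers` table used later in this post are just placeholders), a point lookup is an ordinary GET against the table endpoint:

```
GET http://myaccount.table.core.windows.net/Customers(PartitionKey='AccountsReceivable',RowKey='kevin')
```

The response comes back as an Atom entry, which the Data Services client materializes into your entity class for you.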

When you're working with table storage, there are a few things that you're going to need. Once you've got these, you're good to go:

  • References to System.Data.Services.Client and Microsoft.WindowsAzure.StorageClient. (You'll also need a reference to Microsoft.WindowsAzure.ServiceRuntime if you're hitting table storage from within the cloud itself... remember that you can hit table storage from the desktop too, e.g. from WPF applications.)
  • Credentials. There have been some changes to the way storage client credentials work that are beyond the scope of this post, but you can still use the same account name/shared key pattern that you used in the past.
  • A DataServiceContext. You're going to need this to interact with the tables in table storage. As you'll see in the code below, the pattern is to create your own context that derives from the base and exposes your tables as IQueryables. If you've ever worked with ADO.NET Data Services or Entity Framework before, this pattern should also look familiar.
  • Entity objects. Every table that you have in table storage contains arbitrary columns. In other words, if you really wanted, you could have a different schema for every row in your table. However, to work with it using the Data Services client, each row needs to conform to a fixed schema, which you'll represent with a regular C# class containing the necessary partition key and row key properties. This class also needs a parameterless constructor (required by the Data Services client to reconstitute instances of the class from the HTTP results).
  • The cloud table client. This new class lets you create tables and test for the existence of tables. You don't need it for querying table storage; it's more of an administrative class for dealing with table storage itself.
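As a rough sketch of where those credentials can live when you're running outside the fabric, here's a minimal appSettings block. The AccountSharedKey and TableStorageEndpoint keys match the ones this post reads from configuration; the AccountName key and the placeholder values are just assumptions for illustration:

```
<configuration>
  <appSettings>
    <add key="AccountName" value="myaccount" />
    <add key="AccountSharedKey" value="[your shared key]" />
    <add key="TableStorageEndpoint" value="http://myaccount.table.core.windows.net" />
  </appSettings>
</configuration>
```

When you're running inside the fabric, the same settings move into the role's service configuration file.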

The first thing we're going to want to do is get the credentials. The new SDK allows us to dynamically determine whether we're running in the fabric or running as a standalone app (which lets us build apps that run on-premise OR in the cloud!). Here's some code I used to get the configuration settings for the account name and shared key:

string accountName = ConfigurationManager.AppSettings["AccountName"];
string accountKey = ConfigurationManager.AppSettings["AccountSharedKey"];
string tableBaseUri = ConfigurationManager.AppSettings["TableStorageEndpoint"];
if (RoleEnvironment.IsAvailable)
{
    // Running in the fabric - pull the values from the service configuration instead
    accountName = RoleEnvironment.GetConfigurationSettingValue("AccountName");
    accountKey = RoleEnvironment.GetConfigurationSettingValue("AccountSharedKey");
    tableBaseUri = RoleEnvironment.GetConfigurationSettingValue("TableStorageEndpoint");
}

Once you've got the account key and the account name, you can get an instance of the storage credentials and table client classes:

StorageCredentialsAccountAndKey creds =
new StorageCredentialsAccountAndKey(accountName, accountKey);
CloudTableClient tableStorage = new CloudTableClient(tableBaseUri, creds);
CustomerContext ctx = new CustomerContext(tableBaseUri, creds);

Using the table storage class, we can create a new table (if it doesn't already exist):

tableStorage.CreateTableIfNotExist("Customers");

CustomerRow cust = new CustomerRow("AccountsReceivable", "kevin");
cust.FirstName = "Kevin";   // sample values
cust.LastName = "Hoffman";
ctx.AddObject("Customers", cust);
ctx.SaveChanges();

Here I'm also using my customer context class and my customer row class (will show those shortly) in order to put a new customer into table storage. Note my use of an application name for the partition key and the username for the row key. Entire chapters of books can (and will) be written on strategies and patterns for using partition and row keys.
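To sketch why those key choices matter (this assumes the CustomerContext and CustomerRow classes shown later in this post), queries that filter on the partition key - or on both keys - are the fast, indexed path in table storage:

```
// Point lookup: partition key + row key identify exactly one entity
CustomerRow kevin = ctx.Customers
    .Where(c => c.PartitionKey == "AccountsReceivable" && c.RowKey == "kevin")
    .FirstOrDefault();

// Partition scan: every customer belonging to one application
CustomerRow[] appCustomers = ctx.Customers
    .Where(c => c.PartitionKey == "AccountsReceivable")
    .ToArray();
```

Keep in mind that the Data Services LINQ provider only supports a subset of operators against table storage, so stick to simple comparisons on the key properties where you can.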

Now let's say that we're inside an MVC 2 controller and we want to make the list of customers available to the view. If we're not using a strongly typed view (which we should be whenever we can...), we can use code that looks like this:

CustomerRow[] customers = ctx.Customers.ToArray();
ViewData["Customers"] = customers;

Now let's look at the CustomerContext class:

public class CustomerContext : TableServiceContext
{
    public CustomerContext(string uri, StorageCredentials creds)
        : base(uri, creds) { }

    public IQueryable<CustomerRow> Customers
    {
        get { return this.CreateQuery<CustomerRow>("Customers"); }
    }
}

The CustomerRow class is just a POCO class that has a default constructor and a constructor that takes a partition key and a row key, and inherits from the TableServiceEntity class.

public class CustomerRow : TableServiceEntity
{
    private string firstName;
    private string lastName;
    private string userName;
    private string applicationName;

    public CustomerRow(string applicationName, string userName)
        : base(applicationName, userName)
    {
        ApplicationName = applicationName;
        UserName = userName;
    }

    public CustomerRow() : base() { }

    // ... property accessors snipped for brevity ...
}

I snipped out the rest of the class for brevity - I'm assuming we've all seen stock property accessors before. At this point you should be ready to roll with table storage. There's one other benefit they gave us in the November 2009 CTP - you no longer need to pre-rig your database schema in your SQL Server 2008 database! The new development storage simulator accurately simulates the dynamic-schema nature of the actual table storage in the cloud. I can't begin to describe how many headaches this alleviates.

Enjoy table storage on the new November 2009 CTP, and I'll be posting similar entries about the new queue storage and blob storage clients shortly!


More Stories By Kevin Hoffman

Kevin Hoffman, editor-in-chief of SYS-CON's iPhone Developer's Journal, has been programming since he was 10 and has written everything from DOS shareware to n-tier, enterprise web applications in VB, C++, Delphi, and C. Hoffman is coauthor of Professional .NET Framework (Wrox Press) and co-author with Robert Foster of Microsoft SharePoint 2007 Development Unleashed. He authors The .NET Addict's Blog at .NET Developer's Journal.