Get logging in Windows Azure with Enterprise Library

Hi again – yes, I know it’s been a while. Recently I started a new role at Microsoft which involves helping customers deploy applications on Windows Azure, Microsoft’s cloud computing platform. I thought it might be fitting if I kick this off with a post that bridges my (now quite old) role with my new one and talk about using Enterprise Library with Windows Azure.

One of the great things about Windows Azure is that it includes the .NET Framework in its entirety, so for the most part .NET code you’ve written in the past will run fine in the cloud. However, there are some important differences between your own servers and the cloud that you need to be aware of as a developer or designer. For example, while it is technically still possible to log to a flat file or the Windows Event Log, it’s generally impractical to access the resulting logs. As a result, logging and instrumentation are among the things you need to do a little differently in the cloud.

Using Windows Azure Diagnostics

It so happens that Windows Azure includes its own diagnostics infrastructure. It’s well documented elsewhere (try the official documentation or this MSDN Magazine article) but in a nutshell it involves high-performance buffers built on ETW which collect log events from your application. These events, along with other diagnostic sources such as event logs and performance counters, can be transferred to Windows Azure Storage (tables and blobs) either at predefined intervals or on-demand.

The most commonly described approach to logging to Windows Azure Logs is to use the standard System.Diagnostics Trace class as follows:

Trace.TraceInformation("Trace works fine, but you're somewhat limited...");

Next, you’ll need to configure System.Diagnostics to log to Windows Azure Diagnostics via the DiagnosticMonitorTraceListener:

<system.diagnostics>
  <trace autoflush="true">
    <listeners>
      <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, 
          Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, 
          PublicKeyToken=31bf3856ad364e35" name="AzureDiagnostics" />
    </listeners>
  </trace>
</system.diagnostics>

Finally, you’ll need to configure and start Windows Azure Diagnostics. In this example, let’s ask it to transfer all logs to Windows Azure Storage (specifically, the WADLogsTable table) every 5 minutes by modifying the WebRole or WorkerRole class as follows:

public override bool OnStart()
{
    StartDiagnostics();

    // Restart the instance when configuration settings change (handler shown below).
    RoleEnvironment.Changing += RoleEnvironmentChanging;
    return base.OnStart();
}

private void StartDiagnostics()
{
    // Get default initial configuration.
    var config = DiagnosticMonitor.GetDefaultInitialConfiguration();

    // Transfer all log entries (no severity filter) on a five-minute schedule.
    config.Logs.ScheduledTransferLogLevelFilter = LogLevel.Undefined;
    config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

    // Start the diagnostic monitor with the modified configuration.
    DiagnosticMonitor.Start("DiagnosticsConnectionString", config);
}
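
The RoleEnvironmentChanging handler wired up in OnStart isn’t shown above; for completeness, here’s a sketch of the stock handler generated by the Visual Studio cloud project templates (it needs using directives for System.Linq and Microsoft.WindowsAzure.ServiceRuntime):

private void RoleEnvironmentChanging(object sender, RoleEnvironmentChangingEventArgs e)
{
    // Request an instance restart when a configuration setting changes, so the
    // role comes back up with the new settings applied.
    if (e.Changes.Any(change => change is RoleEnvironmentConfigurationSettingChange))
    {
        e.Cancel = true;
    }
}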

Of course, you’ll also need to make sure you define a connection to your Windows Azure Storage account in your ServiceConfiguration.cscfg file.
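As a sketch (the service and role names here are placeholders of mine), the setting looks something like this; remember it also needs to be declared in ServiceDefinition.csdef, and for local testing you can point it at development storage:

<ServiceConfiguration serviceName="MyService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole1">
    <Instances count="1" />
    <ConfigurationSettings>
      <!-- Swap in a real storage account connection string before deploying to the cloud. -->
      <Setting name="DiagnosticsConnectionString" value="UseDevelopmentStorage=true" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>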

Bringing in the Logging Application Block

There’s nothing wrong with any of the above code, but if you’re used to using Enterprise Library’s Logging Application Block for your logging needs, you’ll soon find that you’re somewhat limited. For example, you won’t automatically get access to things like the process ID and machine name, you won’t be able to customise your message format, and you won’t get the advanced routing and filtering options. The great news is that since the Logging Application Block is built on System.Diagnostics, you can use it in your cloud applications and configure it to write to Windows Azure logs.

First, let’s look at your logging code. There’s nothing special you need to do here, and any existing EntLib logging code should work: since the Logging Application Block’s API is abstracted from the chosen logging mechanism, you just log like you always did, either using the old-school Logger facade or the new-fangled DI alternatives. Make sure you choose an appropriate Severity level, as both Enterprise Library and Windows Azure Diagnostics can be configured to use this to filter messages or decide which ones to transfer to storage.

Logger.Write("Get Logging in Windows Azure with Enterprise Library!",
    "General", 1, 0, System.Diagnostics.TraceEventType.Information);

Now, let’s look at your configuration file. You can delete the <system.diagnostics> section we described earlier and use Enterprise Library’s configuration instead. You can use either the config tool or an XML editor to edit the config, but the important thing is to choose a System.Diagnostics Trace Listener and configure it to use the same DiagnosticMonitorTraceListener we used before. I’ve included just the <listeners> section of the Logging Application Block’s configuration below; in your app you can continue to use whatever combination of sources, formatters, filters and listeners you like.

<listeners>
  <add listenerDataType="Microsoft.Practices.EnterpriseLibrary.Logging.Configuration.SystemDiagnosticsTraceListenerData, Microsoft.Practices.EnterpriseLibrary.Logging, Version=5.0.414.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
    type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
    name="Azure Diagnostics Trace Listener" />
</listeners>
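
To wire that listener up to the “General” category used in the earlier logging code, the category source is a one-liner; here’s a sketch of just the <categorySources> section (the config tool will generate this for you):

<categorySources>
  <add switchValue="All" name="General">
    <listeners>
      <add name="Azure Diagnostics Trace Listener" />
    </listeners>
  </add>
</categorySources>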

Finally, you’ll still need to configure and start the Diagnostic Monitor when your role starts. The code in the first example will continue to do the trick (although keep in mind you may prefer to configure it differently by applying log transfer filters or choosing to transfer logs to storage only on-demand).
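
For instance, here’s a minimal variation on the earlier StartDiagnostics method (the method name is mine) that only ships warnings and above to storage:

private void StartDiagnosticsWarningsOnly()
{
    var config = DiagnosticMonitor.GetDefaultInitialConfiguration();

    // Only transfer Warning, Error and Critical entries; Verbose and Information
    // events stay in the local buffer until they age out.
    config.Logs.ScheduledTransferLogLevelFilter = LogLevel.Warning;
    config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

    DiagnosticMonitor.Start("DiagnosticsConnectionString", config);
}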

You should now be ready to deploy your application to the cloud and see it in action. Remember, with Windows Azure Diagnostics the log events are only transferred to Windows Azure Storage at the specified interval or on-demand (which requires additional code or PowerShell scripts), so you may need to be more patient than you’re used to. Once the transfer is complete, you should see the events in your WADLogsTable table. The easiest way to view the data in the table is Visual Studio 2010’s Server Explorer, but there are many other web- and Windows-based tools that can do the same and more. The data in the log entries will have a little bit of XML padding, but you’ll see your formatted data sitting proudly in the middle:

<TraceSource>General</TraceSource>
<Object>Timestamp: 9/6/2010 5:05:38 AM
Message: Get Logging in Windows Azure with Enterprise Library
Category: General
Priority: 1
EventId: 0
Severity: Information
Title:
Machine: RD00155D3134BD
App Domain: /LM/W3SVC/1273337584/ROOT-1-129282230874485863
ProcessId: 1724
Process Name: d:\windows\system32\inetsrv\w3wp.exe
Thread Name: 
Win32 ThreadId:2560
Extended Properties: </Object>
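
If you’d rather verify the transfer from code, here’s a minimal sketch using the StorageClient library from the same SDK; WadLogEntry and its properties are my own names for a subset of the WADLogsTable columns:

using System;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Maps a subset of the WADLogsTable columns (PartitionKey, RowKey and
// Timestamp are inherited from TableServiceEntity).
public class WadLogEntry : TableServiceEntity
{
    public string Role { get; set; }
    public string RoleInstance { get; set; }
    public string Message { get; set; }
}

public static class WadLogReader
{
    public static void DumpRecentEntries(string storageConnectionString)
    {
        var account = CloudStorageAccount.Parse(storageConnectionString);
        var context = account.CreateCloudTableClient().GetDataServiceContext();

        // Pull back the first few rows to confirm the scheduled transfer ran.
        foreach (var entry in context.CreateQuery<WadLogEntry>("WADLogsTable").Take(10))
        {
            Console.WriteLine("{0} ({1}): {2}", entry.Role, entry.RoleInstance, entry.Message);
        }
    }
}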

To summarise, while Windows Azure does require you to do a few things a little differently, most of your existing code and favourite techniques should still work fine. With just a small configuration change and some startup code, your Enterprise Library Logging code will feel right at home in the cloud.

Comments

  • Anonymous
    September 05, 2010
    great work

  • Anonymous
    September 23, 2010
    This is great Tom, really consolidated it all for us on our project. I read your article you posted circa 2006 around using external configuration files for the ent lib configuration information. Seems to me it would be really useful to be able to store, say, the logging filters, categories and levels (e.g. verbose) somewhere other than in the web.config, which would mean one wouldn't have to redeploy one's app to the cloud just to switch logging in or out. It seems there isn't a way to store it in the ServiceConfiguration.cscfg file in the CloudService role, so perhaps storing it somehow in Cloud Storage would make it dynamically configurable, especially as deploys take so long. Cheers again for the continued insights. Colin