

Console Hosted proxy to read performance counters from Azure into PowerPivot

 

You can configure Windows Azure Diagnostics to collect the performance counters of your application and store them in your storage account; the counters become available in the WADPerformanceCounters table. There are third-party tools and some blog posts about how to consume this data, but I wanted a few specific features:
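For context, here is a minimal sketch of how a role might turn on counter collection with the classic diagnostics API. The counter specifier and intervals are illustrative values, not taken from the attached solution:

```csharp
using System;
using Microsoft.WindowsAzure.Diagnostics;

// Sketch: configure WAD to sample a counter and transfer it to table storage.
var config = DiagnosticMonitor.GetDefaultInitialConfiguration();
config.PerformanceCounters.DataSources.Add(new PerformanceCounterConfiguration
{
    CounterSpecifier = @"\Processor(_Total)\% Processor Time",  // illustrative
    SampleRate = TimeSpan.FromSeconds(30)
});
// Push the buffered samples to the performance counters table every minute.
config.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
DiagnosticMonitor.Start(
    "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);
```

This is configuration code for the role itself; everything below happens on the client side, reading what WAD has already transferred.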

 

  1. Download the data in parallel, as fast as possible
  2. Download only a subset of the counters
  3. Configure the time window

This is usually a large amount of data, so I want the ability to download a subset and use PowerPivot and Power View to analyze it.

Table storage exposes a REST API, but PowerPivot has no built-in support for authenticating with the Azure Storage key infrastructure. You can do it from the .NET Framework with the Azure SDK; see for example the blog entry Querying Azure Perf Counter Data with LINQ.
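A sketch of that approach with the StorageClient library might look like this. The entity class, table name, and the tick-prefixed PartitionKey filter are assumptions about how WAD lays out its rows; verify them against your own table before relying on the range scan:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Sketch (not the attached solution): fetch one counter's time window.
public class PerfCounterEntity : TableServiceEntity
{
    public string CounterName { get; set; }
    public double CounterValue { get; set; }
}

static string TickKey(DateTime utc)
{
    // WAD is commonly described as using a "0"-prefixed tick count as the
    // PartitionKey; treat this format as an assumption to verify.
    return "0" + utc.Ticks;
}

static List<PerfCounterEntity> DownloadCounter(
    CloudStorageAccount account, string counter, DateTime startUtc, DateTime endUtc)
{
    // One context per call: DataServiceContext is not thread-safe, and each
    // parallel download should own its connection.
    var ctx = account.CreateCloudTableClient().GetDataServiceContext();
    var query = (from e in ctx.CreateQuery<PerfCounterEntity>("WADPerformanceCountersTable")
                 where e.PartitionKey.CompareTo(TickKey(startUtc)) >= 0
                    && e.PartitionKey.CompareTo(TickKey(endUtc)) <= 0
                    && e.CounterName == counter
                 select e).AsTableServiceQuery();
    return query.ToList(); // AsTableServiceQuery follows continuation tokens
}
```

Each counter can then be fetched on its own thread, for example with `Parallel.ForEach(counters, c => results[c] = DownloadCounter(account, c, start, end));` — which is exactly what makes the connection-limit issue described later bite.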

PowerPivot can consume OData (see, for example, Experiments with OData), and .NET has WCF Data Services, which can publish feeds as OData. So I put all these pieces together in a console application that reads the performance counters and exposes them locally as an OData service that PowerPivot can consume.
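Hosting such a feed from a console application takes only a few lines with WCF Data Services. In this sketch, `CounterDataService` and `CountersContext` are assumed names for the service class and the in-memory context holding the downloaded rows:

```csharp
using System;
using System.Data.Services;
using System.Data.Services.Common;

// Sketch: expose the downloaded counters as a read-only OData feed.
public class CounterDataService : DataService<CountersContext>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        // Read-only access to every entity set in the context.
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V3;
    }
}

// In Main, after the downloads complete (port taken from the SvcPort setting):
var baseUri = new Uri("http://localhost:8080/");
var host = new DataServiceHost(typeof(CounterDataService), new[] { baseUri });
host.Open();
Console.WriteLine("OData feed available at " + baseUri);
Console.ReadLine(); // keep the service alive until a key is pressed
```

PowerPivot then consumes `baseUri` like any other OData feed.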

The solution code is attached; you will need to configure a few things in the settings file:

  1. The Port for the local service
  2. The Storage Account name
  3. The Storage Account key
  4. The time to collect, in minutes (for example, 10)
  5. The end time in UTC (or use Now as the value to collect the latest data for the period you specified in Time To Collect)

It will look like this:

   <appSettings>
     <add key="SvcPort" value="8080"/>
     <add key="StorageAccountName" value="YourStorageAccount"/>
     <add key="StorageAccountKey" value="YourStorageAccountKey"/>
     <add key="TimeToCollectInMinutes" value="60"/>
     <add key="EndTimeUTC" value="Now"/>
   </appSettings>
  

There is also a list of performance counters to download in the file CountersList.txt. The console application will start and show the URL to consume.
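The exact contents of CountersList.txt depend on which counters you configured WAD to collect; a hypothetical file might list one counter specifier per line:

```
\Processor(_Total)\% Processor Time
\Memory\Available MBytes
\ASP.NET Applications(__Total__)\Requests/Sec
```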


Now you can use PowerPivot to consume that URL


The console application consumes the data from Azure WAD for each counter in parallel and presents it to Excel as OData; you can see how long it takes to download the data.

 


Finally, you can use Power View to analyze the data.


 

During this experiment I ran into a few interesting issues. The first was the exception Microsoft.WindowsAzure.StorageClient.StorageClientException: The operation has exceeded the default maximum time allowed for Windows Azure Table service operations.

The exception is caused by the combination of parallel threads consuming data from Azure Storage and the default configuration of ServicePointManager in .NET. The default connection limit is 2, so only two connections are used to consume the data; as the code starts multiple threads to download counters, the extra requests are queued and eventually time out. I found a very interesting blog post explaining the situation, Understanding MaxServicePointIdleTime and DefaultConnectionLimit, and the fix in my code was easy: just add one line at the start of the application.

 

        static void Main(string[] args)
        {
            // Raise the per-host connection limit (the default is 2) so the
            // parallel counter downloads are not queued behind two connections.
            ServicePointManager.DefaultConnectionLimit = 48;

The second issue was related to WCF Data Services and the Azure SDK using conflicting references, which produces the error The type 'System.Data.Services.Common.DataServiceProtocolVersion' exists in both 'c:\Program Files (x86)\Microsoft WCF Data Services\5.0\bin\.NETFramework\Microsoft.Data.Services.Client.dll' and 'c:\Program Files (x86)\Reference Assemblies\Microsoft\Framework\.NETFramework\v4.5\System.Data.Services.Client.dll'.

The error occurs because WCF Data Services references Microsoft.Data.Services.Client while the Azure Storage Client references System.Data.Services.Client. The workaround I used was to split the solution into two projects, each with the right dependencies, and it worked (I was afraid I would hit a runtime error, but that didn't happen).

The Visual Studio 2013 code is attached as a .zip to this post.