Service Logger For Windows Azure Roles Using Table Storage Service

February 8, 2013

Logging in Windows Azure can be done through Windows Azure Diagnostics. That solution collects a ton of detailed data that can be hard to parse through. I recently needed a close-to-real-time trace of what my Roles were doing. My current project has many instances running many independent services in parallel, which makes tracing with Windows Azure Diagnostics a challenge. log4net and Enterprise Library offer amazing tools to accomplish what I’m after, but they produce so much detail and data that we often need to resort to parsing tools and third-party applications to extract meaningful information. I needed something quick, lightweight and inexpensive to operate.

At first, I was trying to follow what my instances were up to using the Windows Azure Compute Emulator. This wasn’t what I was looking for, because local environments don’t run exactly like the production or staging environments on the cloud. I spent a few minutes thinking about logging and costs related to Windows Azure Storage transactions and came up with the solution described below.

The code from this Post is part of the Brisebois.WindowsAzure NuGet Package

To install Brisebois.WindowsAzure, run the following command in the Package Manager Console

PM> Install-Package Brisebois.WindowsAzure

Get more details about the NuGet package.

A sample project containing the log viewer can be found in the GitHub repository “Windows Azure Logger”.

The Logger is a static class that accumulates log entries and inserts them into a Windows Azure Table Storage Service in batches. Keeping operational costs to a minimum is achieved by inserting entries in batches of 100 per Table Partition.
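To give a sense of how batching keeps transaction costs down, here is a minimal sketch using the classic WindowsAzure.Storage SDK. The entity shape and names below are my illustration, not the package’s actual internals.

```csharp
using System;
using System.Collections.Generic;
using Microsoft.WindowsAzure.Storage.Table;

// Illustrative log entry; the real package's entity may differ.
public class LogEntry : TableEntity
{
    public string Details { get; set; }
}

public static class BatchInsertSketch
{
    public static void Persist(CloudTable table, string serviceName,
                               IEnumerable<LogEntry> entries)
    {
        var batch = new TableBatchOperation();
        foreach (var entry in entries)
        {
            entry.PartitionKey = serviceName;            // one partition per service
            entry.RowKey = Guid.NewGuid().ToString();    // placeholder key for this sketch
            batch.Add(TableOperation.Insert(entry));

            if (batch.Count == 100) // a batch supports at most 100 operations
            {
                table.ExecuteBatch(batch); // one billed transaction for up to 100 entries
                batch = new TableBatchOperation();
            }
        }
        if (batch.Count > 0)
            table.ExecuteBatch(batch);
    }
}
```

Note that entity group transactions require every operation in the batch to target the same partition, which is why batching per service name works out nicely here.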


Adding Log Entries

Logger.Add("ServiceName",
            "EventName",
            "Details");

Logger.Add("Worker", 
           "Start", 
           DateTime.UtcNow.ToString(CultureInfo.InvariantCulture));

Logger.Add("Worker", 
           "Sleep", 
           "for 1 second");
                
Logger.Add("WebFront", 
           "Error", 
           exception.ToString());
                
Logger.Add("Worker", 
           "Stop",
           DateTime.UtcNow.ToString(CultureInfo.InvariantCulture));

Persisting Accumulated Log Entries

The logger persists automatically when one of three conditions is met; you can also force it to persist through code.

Persistence is triggered by any of these conditions:

  • The logger has accumulated 100 messages
  • There is a 20 second gap between the current message and the previous message
  • The Logger is forced to persist through code
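The flush decision above boils down to a simple guard. Here is a minimal sketch of it; the member names are illustrative, not the package’s actual code.

```csharp
using System;

// Minimal sketch of the persistence decision described above.
public static class FlushPolicy
{
    public static bool ShouldPersist(int accumulated, DateTime lastMessageUtc, bool force)
    {
        return force                                                        // forced through code
            || accumulated >= 100                                           // batch is full
            || DateTime.UtcNow - lastMessageUtc > TimeSpan.FromSeconds(20); // 20-second gap
    }
}
```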

Forcing persistence through code

Logger.Persist(true);

As a best practice, I strongly recommend forcing the persistence of the log when your Worker Roles and Web Roles stop.

public override void OnStop()
{
    Logger.Add("Worker", 
               "Stop", 
               DateTime.UtcNow.ToString(CultureInfo.InvariantCulture));

    Logger.Persist(true);
    
    //This is a delay to allow the service to stop gracefully.
    Thread.Sleep(TimeSpan.FromMinutes(3));
    
    base.OnStop();
}

Log Viewer

(Screenshot: the log viewer in action)

The log viewer is an MVC 4-based page that refreshes every few seconds. This is quite practical when you want to follow what’s happening in your environments. The page uses the following code to query the latest entries from the Windows Azure Table Storage Service.

public async Task<ActionResult> Index()
{
    //Add your service names here
    var partitions = new[]
        {
            "Worker",
            "WebFront"
        };

    var entries = await TableStorageLogger.Logger.Get(10, partitions);
    return View(entries.OrderByDescending(e => e.Created));
}

Be sure to add the service names that you use when you log to the array of partitions. These are used when the Logger queries the Windows Azure Table Storage Service for entries. Service names are used as table partitions, which allows you to read entries for a subset of the available services, helping you focus your efforts when debugging.
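Under the hood, a per-partition query along these lines would return the latest entries. This is a sketch using the classic WindowsAzure.Storage SDK; the package’s actual internals may differ.

```csharp
using System.Collections.Generic;
using Microsoft.WindowsAzure.Storage.Table;

public static class LogQuerySketch
{
    // Take the most recent entries for one service (partition).
    // Because RowKeys sort newest-first, Take(count) yields the latest entries.
    public static IEnumerable<DynamicTableEntity> Latest(
        CloudTable table, string service, int count)
    {
        var query = new TableQuery<DynamicTableEntity>()
            .Where(TableQuery.GenerateFilterCondition(
                "PartitionKey", QueryComparisons.Equal, service))
            .Take(count);

        return table.ExecuteQuery(query);
    }
}
```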

Configuration

Be sure to set your Cloud Storage Account connection string in your Web.config or in your Role’s Cloud Configuration.
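For reference, the setting looks something like the fragment below. The key name “StorageConnectionString” is hypothetical here; check your own configuration for the actual key the package expects.

```xml
<!-- Web.config (the key name is hypothetical) -->
<appSettings>
  <add key="StorageConnectionString"
       value="DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey" />
</appSettings>
```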

(Screenshot: connection string configuration)

Summary

I have been testing this solution for a while now and have been getting great results. It has helped me identify and fix many performance issues. Using this kind of logging has its drawbacks and its benefits. One of the major benefits I’ve identified so far is that it’s easy to read, easy to use and easy to clean up.

This logger may be too simple for production diagnostics, but it’s great for development purposes! It allows me to monitor my services in near real-time from my computer or other browser-enabled devices. I’m able to follow my test runs during my nightly bus rides.

Take some time to test it out, leave your comments and if something is off or missing please submit pull requests on GitHub.

Get the code from https://github.com/brisebois/WindowsAzureLogger

4 responses to Service Logger For Windows Azure Roles Using Table Storage Service

  1. 

    You are storing the logs temporarily in a static variable. Is it safe on a web server?

    • 

      Hi,

      yes, the entries are kept in a static variable until persistence is triggered.

      In my worker and web roles I manually trigger a forced persistence when the role is stopped.

      So far I haven’t missed anything.

      But as you mentioned, if they pull the power then logs are the least of my worries.

      Alex

  2. 

    Hi

    What are the PartitionKey and the RowKey?
    There is no index on the timestamp. How do you manage searches between dates and avoid a full table scan?

    • 

      Hi Shimon,

      This logger is getting quite old in Azure Years. I would strongly recommend using the Azure Diagnostics included in recent SDKs.

      In terms of this code, the RowKeys are ordered in descending order based on the following

      RowKey = string.Format("{0}-{1}", DateTime.MaxValue.Subtract(Created).TotalMilliseconds.ToString(CultureInfo.InvariantCulture), Guid.NewGuid());

      To filter by time spans, you can use a StartsWith approach as described in this post: https://alexandrebrisebois.wordpress.com/2014/10/30/azure-table-storage-using-startswith-to-filter-on-rowkeys/

      In order to get this to work, you will need to flip the upper and lower bounds of your search, and you will need to calculate the max DateTime minus the DateTime you are looking for.
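      To make the bound flipping concrete, here is a sketch of a helper. It is not from the repo; the SDK calls are from the classic WindowsAzure.Storage library, and it inherits the ordering caveats of the string-based RowKey scheme above.

```csharp
using System;
using System.Globalization;
using Microsoft.WindowsAzure.Storage.Table;

// With the descending RowKey scheme above, later timestamps map to
// smaller keys, so the bounds of a date range must be flipped.
public static class RowKeyRange
{
    static string ToKeyPrefix(DateTime moment)
    {
        return DateTime.MaxValue.Subtract(moment)
            .TotalMilliseconds.ToString(CultureInfo.InvariantCulture);
    }

    public static string Filter(DateTime from, DateTime to)
    {
        // 'to' (later) produces the smaller key, 'from' (earlier) the larger one.
        return TableQuery.CombineFilters(
            TableQuery.GenerateFilterCondition(
                "RowKey", QueryComparisons.GreaterThanOrEqual, ToKeyPrefix(to)),
            TableOperators.And,
            TableQuery.GenerateFilterCondition(
                "RowKey", QueryComparisons.LessThanOrEqual, ToKeyPrefix(from)));
    }
}
```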
