

How do you build features on #Azure?

For the past few months my team and I have been using git-flow to build a new REST API for an existing cloud native application. The challenge was considerable, but this is how we did it.

Our team is spread across many geographical locations. We use a mix & match of best practices to help us communicate. For example, we use Skype; it’s on all our devices (phones, tablets, laptops and desktops…) and makes collaboration easy! Our code lives in Git and is continuously built and released to our integration environment for validation.

Our team isn’t unique. Many of my colleagues work in distributed teams. Surprisingly, we all share the same challenges. Time zones, cultural differences and language barriers are now part of each decision we make. Fortunately, we all speak a common language: C#.

With this added complexity, how did we go to production on time? We implemented a process that we continuously tweak and adjust to meet our goals.



With an ecosystem of devices that is constantly evolving, it’s hard to predict who will consume your REST APIs and how they will consume them. Devices as we know them are changing shape, and applications are constantly adapting to new platforms.

Let’s face it: mobile apps are the norm and shouldn’t be ignored. They travel, and they shouldn’t be considered sedentary.

Imagine a scenario where you are on a business trip and try to fetch today’s news from a different continent. Your device formulates a URI with a date and calls a service… no results…
That’s weird! It’s 10 PM and you should be getting results for today’s news. This is probably the moment you realize that you aren’t in your normal time zone… your phone has adapted to your new geographic location, but the API the app is calling has not!
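To make the mismatch concrete, here is a minimal sketch of that client. The endpoint and method names are hypothetical; the point is only that “today” computed from the device’s local clock and “today” as the API understands it can be two different dates:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class NewsClient
{
    // Hypothetical endpoint, used only for illustration.
    private const string BaseUri = "https://api.example.com/news/";

    // The trap: "today" is computed from the device's local clock,
    // which follows the traveller around the globe.
    public static Task<string> GetTodaysNewsLocal(HttpClient client)
    {
        string localDate = DateTime.Now.ToString("yyyy-MM-dd");
        return client.GetStringAsync(BaseUri + localDate);
    }

    // One way out: agree on UTC (or send the offset explicitly) so the
    // client and the API are talking about the same "today".
    public static Task<string> GetTodaysNewsUtc(HttpClient client)
    {
        string utcDate = DateTimeOffset.UtcNow.ToString("yyyy-MM-dd");
        return client.GetStringAsync(BaseUri + utcDate);
    }
}
```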




If your development computer isn’t set to the correct date and time and you are working with Windows Azure, you might end up thoroughly confused. I know I did!

Recently I tried to shift the clock back on my development computer so that I could test a caching mechanism.

After setting the clock back one day, I was unable to connect to Windows Azure services like Blob Storage Service, Queue Storage Service and Table Storage Service. I constantly got HTTP Status Code 403 Forbidden.

I logged into the Windows Azure Management Portal and saw that everything was as it should be. I checked the Windows Azure Service Dashboard to see if any services were down… I even checked that my storage keys had not been regenerated.

I fired up Fiddler and learned nothing more than I already knew… then it came to me: whenever HTTP calls are made, the client sends a timestamp to the server, and whenever the client and the server are out of sync, calls are often refused. So I set my clock back to the right time and the problem went away.

All in all, this was my fault, I tried to fool the system and I got caught.

Everything boils down to the fact that authentication for the Windows Azure Storage Services only accepts requests whose timestamps are within 15 minutes of the service’s current time.

All authenticated requests must include the Coordinated Universal Time (UTC) timestamp for the request. You can specify the timestamp either in the x-ms-date header, or in the standard HTTP/HTTPS Date header. If both headers are specified on the request, the value of x-ms-date is used as the request’s time of creation.

The storage services ensure that a request is no older than 15 minutes by the time it reaches the service. This guards against certain security attacks, including replay attacks. When this check fails, the server returns response code 403 (Forbidden).
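As a rough sketch of where that timestamp comes from, this is roughly what a raw REST call against the Blob service looks like when you set the x-ms-date header yourself. The storage client library normally does this for you, and a real request would also need the Shared Key Authorization header, which is omitted here:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class StorageTimestampExample
{
    public static Task<HttpResponseMessage> ListContainersAsync(string account)
    {
        var client = new HttpClient();

        // The request timestamp must be UTC; "R" produces the RFC 1123 format
        // the storage services expect (e.g. "Tue, 15 Jan 2013 22:00:00 GMT").
        // If the machine's clock is off by more than ~15 minutes, the service
        // rejects the call with 403 Forbidden.
        client.DefaultRequestHeaders.Add("x-ms-date", DateTime.UtcNow.ToString("R"));
        client.DefaultRequestHeaders.Add("x-ms-version", "2012-02-12");

        // NOTE: the Shared Key Authorization header is intentionally omitted;
        // this sketch only illustrates the timestamp requirement.
        var uri = string.Format("https://{0}.blob.core.windows.net/?comp=list", account);
        return client.GetAsync(uri);
    }
}
```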



A few of my peers have recently moved to the cloud, and some have found out that we can no longer ignore time zones!

DateTime is an object that is familiar to us. Most of the time we use it without a second thought. Therein lies our first mistake. Not using DateTimeOffset in code that runs in the cloud can result in some strange and unwanted behavior.

DateTime is usually the reason why our dates are off because, by default, it creates dates in local time. To create a date in UTC you have to specify DateTimeKind.Utc in the date’s constructor.
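A quick sketch of the difference (the literal values are only for illustration):

```csharp
using System;

class DateTimePitfall
{
    static void Main()
    {
        // Without a DateTimeKind, the constructor produces Kind = Unspecified,
        // and DateTime.Now is always local to the machine running the code.
        var localNow  = DateTime.Now;
        var naiveDate = new DateTime(2013, 1, 15, 16, 5, 0);

        // Explicitly UTC.
        var utcDate = new DateTime(2013, 1, 15, 16, 5, 0, DateTimeKind.Utc);

        // DateTimeOffset carries the offset with the value, so the instant it
        // represents survives the trip through a UTC-only environment.
        var withOffset = new DateTimeOffset(2013, 1, 15, 16, 5, 0, TimeSpan.FromHours(-5));

        Console.WriteLine(naiveDate.Kind);         // Unspecified
        Console.WriteLine(utcDate.Kind);           // Utc
        Console.WriteLine(withOffset.UtcDateTime); // 2013-01-15 21:05:00
    }
}
```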

Imagine the following scenario. A developer is executing code in the Windows Azure emulator to debug it before deploying to the production environment in the cloud. The configuration of the code running in the emulator is pointing to the production database, and the code is creating dates using the DateTime object. These dates are then inserted into Windows Azure SQL Database. Then the developer, who is satisfied with the results, goes to production with the new code.

A few days go by, then suddenly a client calls to report weird dates and that the application isn’t working as expected. Tickets are created in the past! A second customer also reports this weird behavior, but he reports that dates are in the future!  Just so we’re clear, time travel is not a possible cause…



It’s no secret that Windows Azure Roles, Services and SQL Database time zones are set to UTC (Coordinated Universal Time). Forgetting this fact can get us in trouble! When we forget about time zones, any operation dealing with dates and times is a potential bug!

For example:

  • Scheduling jobs
  • Comparing dates and times
  • Performing operations on dates and times
  • Parsing string dates and times from the client
  • Storing dates and times in SQL Database

We naturally refer to time using our current time zone. Imagine that a system requires a specific Job to run every Monday at 16h05 GMT-5. It’s very likely that I will create a new DateTime instance that represents 16h05 and that I will happily use it to schedule my weekly Job.
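A rough sketch of the conversion that avoids the trap. For this sketch I assume the Windows “Eastern Standard Time” zone stands in for GMT-5:

```csharp
using System;

class ScheduleInUtc
{
    static void Main()
    {
        // Business requirement: run every Monday at 16h05 GMT-5.
        // Assumption for this sketch: "Eastern Standard Time" stands in for GMT-5.
        var eastern = TimeZoneInfo.FindSystemTimeZoneById("Eastern Standard Time");

        // The trap: on a Windows Azure Role the machine's local time IS UTC,
        // so this DateTime does not represent 16h05 GMT-5 at all.
        var naive = new DateTime(2013, 1, 14, 16, 5, 0);

        // Safer: treat the value as wall-clock time in the intended zone,
        // then convert it to UTC before handing it to the scheduler.
        var scheduledUtc = TimeZoneInfo.ConvertTimeToUtc(
            DateTime.SpecifyKind(naive, DateTimeKind.Unspecified), eastern);

        Console.WriteLine(scheduledUtc); // 2013-01-14 21:05:00 (UTC)
    }
}
```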



I was looking for a way to create cron jobs in Worker Roles on Windows Azure and I found Quartz.Net, a NuGet package that is still being maintained as of January 2013. The documentation wasn’t great and was mostly out of date; furthermore, the API seems to have evolved quite a bit over time. The framework seems quite capable and very flexible, and I’m barely using any of its potential at the moment! It has schedules and triggers; with a better understanding of its functionality, I would probably grant it more responsibility in my workflow.

The following is an example that demonstrates how I was able to successfully schedule jobs.
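A minimal sketch along those lines, using the Quartz.Net 2.x fluent API. The job class and the identifiers are placeholders for illustration, not the exact code from my Worker Role:

```csharp
using System;
using Quartz;
using Quartz.Impl;

// A trivial job; Quartz.Net instantiates it each time the trigger fires.
public class HelloJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        Console.WriteLine("Job fired at {0:u}", DateTimeOffset.UtcNow);
    }
}

public static class JobScheduler
{
    public static IScheduler Start()
    {
        // Build and start an in-process scheduler.
        ISchedulerFactory factory = new StdSchedulerFactory();
        IScheduler scheduler = factory.GetScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<HelloJob>()
            .WithIdentity("helloJob", "workerRole")
            .Build();

        // Every Monday at 16h05. Remember: on a Worker Role the scheduler's
        // clock is UTC, so express the cron expression accordingly.
        ITrigger trigger = TriggerBuilder.Create()
            .WithIdentity("helloTrigger", "workerRole")
            .WithCronSchedule("0 5 16 ? * MON")
            .Build();

        scheduler.ScheduleJob(job, trigger);
        return scheduler;
    }
}
```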
