These days I’m all about automation. While most of us are focused on Python, C#, JavaScript and Node, I’m taking a different approach to Azure DocumentDB. This experiment’s goal is to facilitate the creation and seeding of DocumentDB databases, with very little effort, from JSON documents stored in an Azure Blob Storage container.

Meet DocumentDB

Azure DocumentDB is a NoSQL document database service designed from the ground up to natively support JSON and JavaScript directly inside the database engine. It’s the right solution for web and mobile applications when predictable throughput, low latency, and flexible queries are key. Microsoft consumer applications like OneNote already use DocumentDB in production to support millions of users.
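To give a feel for the seeding step, here is a minimal Python sketch. The helper name is mine, not part of any SDK; it only prepares the parsed blob contents, since DocumentDB addresses documents by a lowercase id property. The actual insert would then go through the DocumentDB client or REST API.

```python
import json
import uuid


def prepare_documents(json_blobs):
    """Parse raw JSON blob contents into documents ready for DocumentDB.

    DocumentDB identifies documents by a lowercase 'id' property; blobs
    that lack one are assigned a generated id so inserts cannot collide.
    """
    documents = []
    for raw in json_blobs:
        doc = json.loads(raw)
        doc.setdefault("id", str(uuid.uuid4()))
        documents.append(doc)
    return documents
```

Each prepared document would then be handed to the client's create-document call, one per blob in the container.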

Continue Reading…

With the latest update of the Semantic Logging Application Block – Out-of-process Service NuGet package, the user under which the Windows Service is executed has changed. In the hope of saving you countless hours of debugging, I am sharing the configuration that should be used on Azure Cloud Services.

When you start the Out-of-Process service be sure to specify the LocalSystem account.

SemanticLogging-svc.exe -s -a=LocalSystem

Failure to execute under the right account will prevent the Out-of-Process service from logging Events to Azure Table Storage.

What is SLAB?

The Semantic Logging Application Block (SLAB) provides a set of destinations (sinks) to persist application events published using a subclass of the EventSource class from the System.Diagnostics.Tracing namespace. Sinks include Azure Table Storage, SQL Server databases, console, flat files and rolling files with several formats, and you can extend the block by creating your own custom formatters and sinks. The console sink is part of this NuGet package; the other sinks mentioned above are available as separate NuGet packages. For the sinks that can store structured data, the block preserves the full structure of the event payload in order to facilitate analyzing or processing the logged data.


What is Azure API Management?

Microsoft Azure API Management is a service that helps protect your mission-critical systems with authentication, rate limiting, quotas and caching to ease load under pressure. Rest easy knowing that only the partners, developers and applications you’ve authorized have access to your APIs, and that those groups are acting in accordance with your policies. Find out more on Azure.com


Like many services on Azure, API Management provides us with a comprehensive REST API. This API allows us to manage Users, Groups, Products and Subscriptions.
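To sketch what a call looks like, here is a small Python helper that builds a request against the direct management endpoint. The host format and the 2014-02-14-preview api-version reflect my understanding of the public preview at the time; verify both against the current documentation before using them.

```python
def management_request(service_name, path, access_token,
                       api_version="2014-02-14-preview"):
    """Build the URL and headers for a direct Management API call.

    `service_name` is your API Management service name, `path` an entity
    collection such as '/users' or '/groups', and `access_token` the
    SharedAccessSignature token generated for the Management API.
    """
    url = "https://{0}.management.azure-api.net/{1}?api-version={2}".format(
        service_name, path.strip("/"), api_version)
    headers = {"Authorization": access_token}
    return url, headers
```

A GET on the resulting URL with those headers lists the entities; POST, PUT and DELETE manage them.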

Working on a multi-region solution, I was really happy to see these APIs. One of the recurring challenges I face every day is replicating my efforts across multiple deployments sprawled over many Azure regions. The only way to do this effectively is to automate everything!

As of August 2014 API Management is still in public preview and is going through constant evolution. New features make their way to production and pieces fall together. The newly released REST APIs are just that, a piece that was missing. Wanting to reduce my workload I decided to create a PowerShell Module to help automate some of my repetitive tasks.

Note: The API does not allow you to define APIs or their representations, and the Developer Portal CMS is not accessible through these APIs. Although these are things I would love to interact with through the REST API, I’m hopeful that something will come along.

Before we start, there are a couple of things we need to do. First, we need to activate the Management APIs on our API Management service. Then we need to generate an access token. I opted for the manual process, which you can follow on the Azure API Management REST API Authentication page. Continue Reading…
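The token the Authentication page walks you through can also be generated in code. This Python sketch signs the identifier and expiry with HMAC-SHA512, which is my understanding of the documented scheme; double-check the exact token format against the Authentication page before relying on it.

```python
import base64
import hashlib
import hmac


def generate_sas_token(identifier, key, expiry):
    """Assemble a SharedAccessSignature token for the Management API.

    Signs 'identifier\\nexpiry' with HMAC-SHA512 using the service's key.
    `expiry` is an ISO 8601 timestamp string,
    e.g. '2015-01-01T00:00:00.0000000Z'.
    """
    data = "{0}\n{1}".format(identifier, expiry).encode("utf-8")
    signature = base64.b64encode(
        hmac.new(key.encode("utf-8"), data, hashlib.sha512).digest()
    ).decode("utf-8")
    return "SharedAccessSignature uid={0}&ex={1}&sn={2}".format(
        identifier, expiry, signature)
```

The identifier and key come from the service's Security settings, where the Management API is activated.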


Rebuilding SQL Database Indexes

A few months ago I wrote a blog post titled “Don’t Forget About Index Maintenance on Azure SQL Database”. Since then, Microsoft Azure SQL Database has changed a lot. We aren’t as concerned about the size of the database anymore, because databases can reach 500 GB in size. Take a moment to think about that number. 500 GB is a lot of data! Before you get excited and move on to more important things, ask yourself this question: does all that data really belong in my SQL Database? Put some thought into it, you may be surprised by the answers you come up with. Continue Reading…
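If you do keep large tables around, the rebuild itself is plain T-SQL. As a small sketch, here is a Python helper that generates the ALTER INDEX statements to run against the database; the table names and fill factor are placeholders, and you should review the options available on your service tier before running them.

```python
def rebuild_index_statements(tables, fillfactor=None):
    """Generate 'ALTER INDEX ALL ON ... REBUILD' statements.

    One statement per table; an optional FILLFACTOR leaves free space
    on each index page to reduce future fragmentation.
    """
    statements = []
    for table in tables:
        stmt = "ALTER INDEX ALL ON [{0}] REBUILD".format(table)
        if fillfactor is not None:
            stmt += " WITH (FILLFACTOR = {0})".format(fillfactor)
        statements.append(stmt + ";")
    return statements
```

Feed the generated statements to your maintenance job one at a time, ideally during a low-traffic window.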


Using PowerShell to Authenticate Against OAuth

From development to deployment, PowerShell is becoming the ‘go to’ automation technology on Microsoft Azure. So, I decided to use PowerShell to perform automated tests against a Web API (a.k.a. a REST service). These tests are built to run during the execution of a Continuous Release cycle and confirm that the API is responding as expected.
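The same flow translates to any scripting language. As an illustrative sketch (the parameter names follow Azure AD’s v1 client-credentials grant, where the target API is identified by a `resource` value; other providers use `scope`), here are the two pieces the tests need: the token request body and the header attached to each API call.

```python
def client_credentials_payload(client_id, client_secret, resource):
    """Form-encoded body for an OAuth 2.0 client-credentials token request."""
    return {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": resource,
    }


def bearer_header(access_token):
    """Authorization header to attach to each Web API test request."""
    return {"Authorization": "Bearer " + access_token}
```

POST the payload to the provider's token endpoint, pull `access_token` from the JSON response, and attach the bearer header to every request the test suite issues.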

Continue Reading…


How do You Version Packages?

This topic seems to come up regularly, and it is usually a friction point for teams because many of us do this differently. In an attempt to standardize versioning and remove this undesired friction, I decided to promote Semantic Versioning (http://semver.org/). Continue Reading…
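Semantic Versioning’s MAJOR.MINOR.PATCH grammar is simple enough to validate mechanically. A minimal Python sketch, assuming versions follow the semver.org format:

```python
import re

# MAJOR.MINOR.PATCH with optional -prerelease and +build metadata,
# per the grammar published at semver.org.
_SEMVER = re.compile(
    r"^(\d+)\.(\d+)\.(\d+)(?:-([0-9A-Za-z.-]+))?(?:\+([0-9A-Za-z.-]+))?$")


def parse_semver(version):
    """Split 'MAJOR.MINOR.PATCH[-prerelease][+build]' into its parts."""
    match = _SEMVER.match(version)
    if not match:
        raise ValueError("not a semantic version: " + version)
    major, minor, patch, prerelease, build = match.groups()
    return (int(major), int(minor), int(patch), prerelease, build)
```

A check like this makes a handy gate in the package publishing script: reject anything that doesn't parse.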


Busy… Starting Role… Repeat

So you deployed a Cloud Service and its status is stuck on “Busy (Starting role… Sites were deployed.)”. By now you’ve probably checked the SDK assembly versions, checked that your Cloud Service runs in the Azure Emulator, and probably everything else that you could think of. So what’s next?

In August 2013, an awesome blog series titled Windows Azure PaaS Compute Diagnostics Data was published. It contains a wealth of priceless information that ended my 24-hour debug session. Continue Reading…


How do you Update Your Wetware?

I’m passionate about technology, but sometimes I need a wetware update. A reminder that technology needs people to change the world.

Wetware is a term drawn from the computer-related idea of hardware or software, but applied to biological life forms. Here the prefix “wet” is a reference to the water found in living creatures. Wetware is used to describe the elements equivalent to hardware and software found in a person, namely the central nervous system (CNS) and the human mind. The term wetware finds use both in works of fiction and in scholarly publications.

Communicating effectively doesn’t come easily. Surprisingly, the skills required to be heard include paying attention to others. Listening with intent and genuine interest has a greater effect than one can imagine.

As a technology enthusiast, and as a dreamer, it’s important that I remember that I need to communicate to succeed. Ideas grow and mature to fruition through collaboration and care.

Curiosity drives me to update my wetware through various activities. For one, I am an active participant in my local and worldwide community, where I learn about technology. By interacting with others, I learn from their experiences and expand my understanding. I listen to podcasts and comment on blog posts. I ask and answer questions on sites like Stack Overflow, and I participate in forums.

Human interaction is at the center of my personal growth and this fact has driven me to two wonderful books. The first book that I had the pleasure of discovering is “Speaking as a Leader”. This book is rich with advice about how to move others with your vision. Using the right approach and the right tools makes a world of difference. The second book is time-tested and has captured the interest of millions. “How to Win Friends and Influence People” is a book that conveys principles through stories about people who have left their mark on history.

The two books complement each other really well, and both offer a wealth of wisdom that begs for a second read.

How do you update your wetware?


Automate Everything!

These two words immediately caught my attention! This is one of the hardest things for me as a developer. I spend most of my time designing and coding away, but I hardly spend any time with PowerShell… so I’ll be blunt: as a developer, I must learn PowerShell and I must change my Definition of Done (DoD). I am not done until my feature can be deployed repeatedly and reliably. Automation makes me sleep better at night because my deployments yield predictable results.

This new e-book is all about taking advantage of what the cloud has to offer. It’s packed with best practices for DevOps, data storage and high availability. Since the authors took a pattern-based approach, each chapter can be read independently.

Download all formats (PDF, Mobi and ePub) as well as a link to the companion content hosted by the Microsoft Virtual Academy.


Getting Acquainted With #Azure Service Bus Event Hubs

The Microsoft Azure ecosystem just keeps growing. This week Microsoft unveiled a very welcome addition to the Microsoft Azure Service Bus. Event Hubs join ranks with Queues, Topics and Relays to offer options adapted to your needs.

Contrasting the Available Service Bus Flavors

  • Relays – are used to bridge communications over the cloud in a secure and transparent manner.
  • Queues – are pipes that allow many publishers and many consumers to communicate over a single channel. This is great for Competing Consumers and for Queue-based Load Leveling.
  • Topics – are pipes that allow fan-out scenarios, where each consumer gets its own copy of the inbound queue. Topics also have handy features like filters. Use this flavor to implement Pipes and Filters.
  • Event Hubs – are a bit more complex. Event Hubs enable the collection of event streams at high throughput, from a diverse set of devices and services. In other words, they help us deal with the 3 Vs.
    • Volume (amount of data)
    • Velocity (speed of data in and out)
    • Variety (range of data types and sources).

Microsoft Azure Service Bus Event Hubs

Event Hubs provide the mechanisms necessary to collect event streams at high throughput from a diverse set of devices and services. They are composed of publisher policies, consumer groups and partitions.

Event Hubs support the following scenarios:

  • Collecting event streams at high throughput from devices/services for use in real-time and batch processing.
  • Connecting millions of devices from diverse platforms for sending data (with individual authentication and flow control).
  • Processing event streams per device “in order” using several backend services (publish/subscribe).
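To make the per-device ordering point concrete, here is a minimal sketch of how a partition key maps events to a partition. The hash function and the partition count of 16 are illustrative assumptions only; the service uses its own internal hashing.

```python
import hashlib


def partition_for(partition_key, partition_count=16):
    """Map a partition key (e.g. a device id) to a partition index.

    Because the mapping is a deterministic hash, every event carrying
    the same key lands in the same partition, which is what preserves
    per-device ordering. MD5 here is illustrative, not what the
    service actually uses.
    """
    digest = hashlib.md5(partition_key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % partition_count
```

A consumer reading a single partition therefore sees one device's events in the order they arrived, while other partitions are processed in parallel.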

Considerations Prior to Creating an Event Hub

You must put some effort into capacity planning before you create an Event Hub. In order to make the right decisions, let’s go over a couple of details about Event Hubs. Continue Reading…