DevOps Deployment Framework: The Rise of DODO

Here at Webjet IT we live and breathe DevOps. Our DevOps journey has been one born out of automation and a cultural shift in how we develop and operate production environments. This first blog will outline our view on automation and how it has helped define a framework that has improved the cycle times of our pipeline management.


We strive to automate our software delivery pipeline and to ensure our software quality is of the highest standard. Automation spans the build, deployment and testing pipeline, and it involves a lot of scripting. Our pipeline consists of TeamCity builds, Octopus Deploy deployments (mostly PowerShell scripts) and Azure infrastructure.

Most of the time when we write a script around a process, it’s a process that occurs often… at times too often. Our deployment script for one product would look very similar to the script for another. A good example: one person writes a script to automate the deployment of an IIS website, and a few days later someone else comes along wanting to do the same thing and writes their own. We end up with two implementations of the same solution. The same applies to software builds, test runs, deployment scripts and many other processes.

Centralised Code Repository for DevOps

Our first approach to solving this duplication was to house all these “helper” scripts in a DevOps code repository where IT staff could help themselves and re-use scripts other people had written.

The problem with this solution is that isolated scripts cannot really be versioned correctly. Versions cannot be enforced, and changes to a centralised script can break things for everyone else using the same set of scripts.

One perfect example of this was the implementation of our scripts for automated deployments of Azure Web Apps. In our DevOps code repository we started writing a set of helper scripts that interacted with the Azure API to deploy Web Apps to the Azure cloud. Developers would pass in input and get full deployment and management capabilities for the web app being deployed. Our development teams could deploy their apps with single commands, perform configuration updates and do slot swaps to make applications active between staging and production slots.

Within two to three weeks we were already deploying about 10 web apps to production using the centralised set of scripts.

Life was good!

The Azure API is simple to use, but one-line commands are often not good enough, and defensive coding practices usually result in many more lines of code that need to be maintained. A centralised framework for these implementations was needed.

The DevOps Web App Deployment Framework was born

We were convinced that what we had was not good enough. Although the scripts were centralised in the DevOps code repository, development teams still had to physically copy the code into their build artifacts before it could be used. By copying code, you lose versioning.

We created a separate code repository to house the newly formed “DevOps Azure Web App Deployment Framework” and implemented Git tags as a versioning mechanism.

Development teams would then use Git submodules to consume the framework rather than copying the code out. This allows developers to pin the version “tag” of the managed deployment framework that they want to use.
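As a sketch of that workflow (the repository names and paths here are purely illustrative, not our actual layout), pinning a submodule to a release tag looks like this:

```shell
# Illustrative only: a throwaway "framework" repo stands in for the
# central deployment framework, with one tagged release.
rm -rf /tmp/dodo-demo && mkdir -p /tmp/dodo-demo && cd /tmp/dodo-demo
git init -q framework
git -C framework -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "deployment scripts v1"
git -C framework tag v1.0.0

# A product repository pulls the framework in as a submodule...
git init -q product
cd product
git -c protocol.file.allow=always submodule add -q /tmp/dodo-demo/framework deploy/framework

# ...and pins it to the release tag it wants to build against.
git -C deploy/framework checkout -q v1.0.0
git add deploy/framework
git -c user.email=demo@example.com -c user.name=demo \
    commit -q -m "Pin deployment framework at v1.0.0"
```

The product repository records only the commit the tag points at, so every clone builds against exactly that framework version until someone deliberately moves the pin.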

The framework quickly evolved from there, and an Azure WebJobs deployment feature was added.

Life got even better!

Development teams were consuming the framework across a number of different Azure web app and web job solutions, and it hardly required any bug fixes. Git submodules introduced their own problems, though, and we had to think of a better approach to consuming the framework.

PowerShell modules were exactly what we needed. They are centralised, self-contained and versioned, and many different versions can live side by side on the same machine. A PowerShell module can also be imported into a single shell instance in memory at runtime, which means it does not even have to be installed on a machine if you don’t want to install it.
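For illustration, that side-by-side behaviour looks like the following (the version number and share path here are hypothetical; only the standard cmdlets are real):

```powershell
# List every version of the module available side by side on this machine
Get-Module -ListAvailable DODO

# Load one specific version into the current session
Import-Module DODO -RequiredVersion 1.2.0

# Or import straight from a path at runtime, with no installation at all
Import-Module '\\build\share\DODO\1.2.0\DODO.psd1'
```

Because the import is scoped to the session, two build agents (or even two shells on the same agent) can happily run different framework versions at the same time.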

DODO was born!

     ______ __
   {-_-_= '. `'.
    {=_=_-  \   \
     {_-_   |   /
      '-.   |  /    .===,
   .--.__\  |_(_,==`  ( o)'-.
  `---.=_ `     ;      `/    \
      `,-_       ;    .'--') /
        {=_       ;=~`    `"`
         <<__ \\__

We needed to call this module something: a code name that would be catchy and could become popular among the teams at Webjet.

The DevOps Deployment Orchestration module (DODO) was born.

The latest code from the DevOps Web App Deployment Framework was copied out into a new repository and converted to a PowerShell module.

We decided to simplify the input and make every DODO command take a JSON object, JSON string or JSON file path. The user can simply write a deployment template and pass it to a DODO command, which performs the actions.

Development teams no longer had to use Git submodules and did not need the code inside their repositories. Wherever we have an Octopus Deploy Tentacle agent, we’d install DODO, and development teams could simplify their deployment scripts by calling one-liner commands with JSON input.

Example: Publish-DODOAzureWebApp $parameters

$parameters would be a JSON object that houses the required deployment parameters, such as the Azure subscription ID, web app name and so on.
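As a sketch, a deployment template for the command above might look something like this (the actual schema is internal to DODO, so treat the property names and values as illustrative):

```json
{
    "subscriptionId": "00000000-0000-0000-0000-000000000000",
    "resourceGroup": "example-resource-group",
    "webAppName": "example-web-app",
    "appSettings": {
        "ENVIRONMENT": "production"
    }
}
```

Saved as a file, the same template drives the command with no extra scripting, e.g. `Publish-DODOAzureWebApp C:\templates\example-web-app.json`.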

Azure and Infrastructure Deployments

DODO grew very quickly to support most of the Azure infrastructure.

You can spin up an entire environment, including Virtual Networks, Web Apps, SQL Servers and databases, Redis Caches, Storage Accounts, Security Groups, Load Balancers, Automation Accounts, Runbooks, Key Vaults, Cloud Services and Virtual Machines, and perform DSC operations.

DODO also supports various Windows automation features, such as installing IIS, web applications and application pools, as well as Application Request Routing features, rules, probes and server farms.

Future of DODO

We’ve recently created a command-line version of DODO (dodo.exe), which is just a C# console application that runs DODO operations.

Because DODO is mainly written in PowerShell, the EXE simply hosts a C# PowerShell runspace that imports DODO into a shell and runs commands over the JSON files passed in.

The beauty of this is that development teams no longer needed any knowledge of PowerShell; they could simply call dodo.exe from their deployment scripts.

Example: dodo.exe C:\path\to\deployment\template.json

As we shift more and more software into the cloud, DODO will continue to target all our automation needs, possibly focusing more on build and testing rather than deployments alone.

There is also talk of potentially moving DODO into the open-source space – but who knows 🙂

The future is bright 🙂