Automated Hugo Releases With Azure Pipelines

Thursday, August 22, 2019 6:18 AM

Last week I wrote about how I migrated my site from a dynamic CMS to a static site generated using Hugo. The site & all supporting processes are hosted in Microsoft Azure. In this post, I will explain how the site is automatically built and deployed both when I push new or updated content and on a scheduled basis. This is all implemented using Azure DevOps pipelines.

In another post, Hosting Hugo on Azure, I detailed the topology of how the site is hosted and exposed to the world using Azure resources so I won’t be covering that here.

Before you dive in, a word of warning: this post is going to be very long & comprehensive. I’m documenting this for myself, capturing my process as it stands at launch in August 2019. Let me first explain the high-level process so you can get a complete picture of the entire build & release flow.


The CI/CD process I’ve configured for the site is designed so anyone using Hugo with the same hosting setup can copy & paste it in. Specifically: storing the Hugo site & content in an Azure DevOps repo and hosting it using Azure Storage Blob support for static websites. Just do the following:

  1. Copy & paste in three YAML files that define the Azure DevOps pipeline: one main file & two templates for building & deploying
  2. Create a variable group with a set of three name-value pairs
  3. Optionally modify any parameters, but the defaults will likely be good enough for most

To recap, my setup includes two sites: a live site & a preview site (URL not shared) that includes all draft content & content to be published in the future, & excludes all content that will expire.


So what does my pipeline do? My requirements are as follows:

  • Build & deploy both sites immediately when updating to the master branch
  • Build & deploy only the preview site immediately when updating any branch: This allows me to share or preview the site live for any new features or content I’m working on.
  • Build & deploy only the master branch on a regular schedule Monday-Friday: I like to write a bunch of blog posts at once and publish them over time so I use Hugo’s publishDate front matter capability. Because Hugo is static, only content that has no publishDate set or where the publishDate is before the build time will be included. Therefore, I rebuild and deploy the site periodically throughout the day to give me that CMS-like publish schedule.

These requirements are fulfilled using Azure DevOps Pipelines. It is implemented entirely in YAML with three variables set in a DevOps variable group that define the target site’s URL & credentials to the Azure Storage Blob where the files are stored.

Azure DevOps Pipelines Overview

If you’re familiar with Azure DevOps you might want to skip this section.

Azure DevOps Pipelines are broken down into multiple stages. Typically stages flow sequentially, but they also support branching out: one stage can depend on other stages, or on no previous stages at all.

Early on, Azure DevOps supported builds & releases. You could define builds using the browser-based tools or YAML. Releases were typically created and managed through the browser-based tools & could contain multiple stages. In the more modern pipelines, you can define any number of stages however you see fit, entirely in YAML.

For my setup, I have two stages: build & deploy. The build stage is ultimately responsible for creating the files for the site. These are generated using the Hugo executable on Linux, macOS or Windows. While I use Linux for my builds, nothing is specific to Linux, and with very little work you can switch it over to macOS or Windows if you like. The deploy stage takes the files from the build stage, uploads all added/updated files to the Azure Storage Blob & deletes any files no longer needed.

Pipelines can get complicated quickly. To simplify things, they support templates for both jobs and steps. In my configuration, I have a template for the build job and a template for the deploy job. Each accepts a number of parameters for configuration.

Azure DevOps Pipeline for Building & Deploying Hugo Sites

The pipeline is implemented with the file azure-pipelines.yml in the root of the project. The top of this file contains the rules that tell Azure DevOps when the pipeline should run:
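A sketch of what the top of that file might look like follows. The variable group name is a hypothetical placeholder, and the cron schedule is my UTC translation of the ET times described below (assuming daylight time):

```yaml
# Run the pipeline on a push to any branch
trigger:
  branches:
    include:
      - '*'

# Scheduled builds of master, Monday-Friday (cron times are UTC)
schedules:
  - cron: '3 12,15,17,22 * * 1-5'
    displayName: 'Weekday scheduled builds'
    branches:
      include:
        - master
    always: true  # run even when the source hasn't changed

# Variable group holding the storage account names & keys
variables:
  - group: 'hugo-site-settings'  # hypothetical group name
```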

There are a few things to note here:

  • trigger: This section says “always run the pipeline when there’s a push to any branch”. I control the specific rules of which site should be built within the jobs themselves.

  • schedules: This section tells Azure DevOps to run a build on the master branch at 8:03a, 11:03a, 1:03p & 6:03p ET (the times listed in the cron schedule are UTC… I’m -0400 or -0500 depending on the time of year), Monday-Friday. The always: true says “always run this build, even when the source code hasn’t changed.” This ensures any posts that should be published, simply because the build time is now past the content’s publishDate, will show up on the site.

  • variables: This section defines the name of the variable group I’ve created in my DevOps account.

Now, let’s look at the build stage.

DevOps Stage 1: Building Hugo Sites

We’ll start with the build stage. This is defined in the build job template, saved in my project as ./build/site-build-job.yml:
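A minimal skeleton of such a job template might look like this; the parameter names are assumptions based on how they’re referenced later in this post:

```yaml
# ./build/site-build-job.yml (sketch)
parameters:
  name: ''                  # job name
  condition: ''             # when this job should run
  hugo_version: ''          # Hugo release to download
  hugo_base_url: ''         # passed to hugo's --baseURL
  hugo_args: ''             # extra args, e.g. '--buildDrafts --buildFuture'
  site_storage_account: ''  # becomes AZURE_STORAGE_ACCOUNT
  site_storage_key: ''      # becomes AZURE_STORAGE_KEY

jobs:
  - job: ${{ parameters.name }}
    condition: ${{ parameters.condition }}
    pool:
      vmImage: ubuntu-latest
    steps:
      # 1: download & install Hugo
      # 2: build the site with the hugo executable
      # 3: log files to upload/delete (hugo deploy --dryRun)
      # 4: zip the built site
      # 5: publish build artifacts
      - script: echo "steps detailed in the sections below"
```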

The job is well documented with comments as you can see in the snippet above, but I’ll explain each step in more detail in a moment. Before we do that, let’s jump back to the azure-pipelines.yml file and see how this job is run for my live site:
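The stage might reference the template along these lines; the Hugo version, base URL, and variable names here are illustrative assumptions, not my actual values:

```yaml
stages:
  - stage: build
    jobs:
      # live site: only built when master triggered the pipeline
      - template: ./build/site-build-job.yml
        parameters:
          name: build_live_site
          condition: eq(variables['Build.SourceBranch'], 'refs/heads/master')
          hugo_version: '0.57.2'                          # hypothetical
          hugo_base_url: 'https://www.example.com/'       # hypothetical
          hugo_args: ''
          site_storage_account: $(liveSiteStorageAccount) # from the variable group
          site_storage_key: $(liveSiteStorageKey)
```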

Here you see where I’m referencing the job template above and passing in some parameters. A few things to call out:

  • condition: This ensures this job is only built when the master branch is the trigger for the build.
  • hugo_*: These parameters define values that will be added to the Hugo executable when it’s run on the command line.
  • site_storage_*: These will ultimately set the values of two environment variables, AZURE_STORAGE_ACCOUNT & AZURE_STORAGE_KEY, when I run the hugo deploy command. I explain how this works later when we look at the specific steps in detail.

Here’s what the build log looks like when it runs. I’m only covering the live site in the detailed explanation, but here you can see the preview builds as well.

Let’s jump back to the job template file, ./build/site-build-job.yml, & look at the actual steps in detail. However, if all this makes sense to you so far, you may want to jump down to the section DevOps Stage 2: Deploying Hugo Sites.

The first part of the template worth noting is the pool/vmImage: ubuntu-latest. This is where I’m telling DevOps to run the build on the latest supported version of Ubuntu (Linux).

Build Step 1: Download & Install Hugo

The first step is to download the desired Hugo version & install it:
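A sketch of this step, assuming the template parameters shown earlier and Hugo’s Linux .deb release packaging:

```yaml
# Download the pinned Hugo release from GitHub & install it
- script: |
    wget "https://github.com/gohugoio/hugo/releases/download/v${{ parameters.hugo_version }}/hugo_${{ parameters.hugo_version }}_Linux-64bit.deb" \
      -O "$(Pipeline.Workspace)/hugo.deb"
    sudo dpkg -i "$(Pipeline.Workspace)/hugo.deb"
  displayName: Download & install Hugo
```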

I’m pulling a specific version from the list of releases the Hugo team publishes on GitHub.

Build Step 2: Build the Site Using the Hugo Executable

With Hugo installed, build the site. DevOps will have already cloned the repo.
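This step might look something like the following; the exact arguments and output folder are assumptions:

```yaml
# Build the site; the job's parameters control the hugo arguments
- script: >
    hugo
    --source $(Build.SourcesDirectory)
    --destination $(Build.SourcesDirectory)/public
    --baseURL ${{ parameters.hugo_base_url }}
    ${{ parameters.hugo_args }}
  displayName: Build site with Hugo
```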

Notice I’m using the parameters passed into the job to control some arguments on the hugo executable.

Build Step 3: Log all Files to be Uploaded & Deleted

This step isn’t strictly necessary here, as it’s really only needed in the deploy stage, but I like to run it just to get the information into this pipeline’s execution log.
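A sketch of this logging step, assuming the storage parameters from the job template:

```yaml
# Log what hugo deploy WOULD upload & delete, without actually doing it
- script: hugo deploy --dryRun
  displayName: Log files to upload & delete
  workingDirectory: $(Build.SourcesDirectory)
  env:
    AZURE_STORAGE_ACCOUNT: ${{ parameters.site_storage_account }}
    AZURE_STORAGE_KEY: ${{ parameters.site_storage_key }}
```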

This is using the relatively new deploy command Hugo introduced. The argument --dryRun tells Hugo to log what it would do without actually uploading or deleting anything. This command is for those who host their site with one of the three big cloud providers: Google Cloud Platform, AWS or Azure. Within my site’s config.yml file, I have added the following entry:

    deployment:
      targets:
        - name: "azure accom"
          url: "azblob://$web"

The name isn’t important. The deploy command will use this deployment (as it’s the only one listed) to learn “this is an Azure Storage Blob” and “the files go in the $web container”. It will log in to Azure using the values of the AZURE_STORAGE_ACCOUNT & AZURE_STORAGE_KEY environment variables I set in the task above.

Here’s what the output of the command looks like… much better than using the DevOps Azure File Copy task to upload over a GB of files!

Build Step 4: Zip all Site Files (with a hack)

With the site built, we have everything we need so it’s time to publish build artifacts so other stages, like the deploy stage, can use it. Hugo builds the site in the folder ./public.

At first I was publishing this entire folder as an artifact, but I saw that take nearly 20 minutes. Why? My site consists of over 6,000 HTML files + all the referenced media. I think the quantity of files was slowing things down. I had the exact same problem over on the deploy stage when it was pulling the archive down. Factor in two sites and I was looking at a 60+ minute build+deploy on every push. Nope… got to do something about that…

So I tried zipping only the necessary stuff up, then publishing the resulting ZIP. While the zipped file is nearly 1GB, the resulting ZIP+copy time is roughly 50 seconds. Yeah, that will do!
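The zipping step can be done with the built-in ArchiveFiles task; the folder paths here are assumptions:

```yaml
# Zip the generated site: one ~1GB ZIP copies far faster
# than thousands of individual files
- task: ArchiveFiles@2
  displayName: Zip built site
  inputs:
    rootFolderOrFile: $(Build.SourcesDirectory)/public
    includeRootFolder: false
    archiveType: zip
    archiveFile: $(Build.ArtifactStagingDirectory)/site.zip
```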

Build Step 5: Publish Build Artifacts for other Stages

The last step is to publish the artifacts for future stages, like the deploy stage:
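This might be implemented with two PublishBuildArtifacts tasks; the paths and artifact names are assumptions:

```yaml
# Publish the Hugo installer so the deploy stage doesn't re-download it
- task: PublishBuildArtifacts@1
  displayName: Publish Hugo installer
  inputs:
    PathtoPublish: $(Pipeline.Workspace)/hugo.deb
    ArtifactName: hugo

# Publish the zipped site files
- task: PublishBuildArtifacts@1
  displayName: Publish zipped site
  inputs:
    PathtoPublish: $(Build.ArtifactStagingDirectory)/site.zip
    ArtifactName: site
```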

Here you see I’m publishing the Hugo installer (no sense in downloading it from GitHub again) and the site files ZIP created in the previous step.

DevOps Stage 2: Deploying Hugo Sites

Now that the build stage is done, let’s look at the deploy stage. This is defined in the deploy job template, saved in my project as ./build/site-deploy-job.yml:
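A minimal skeleton of the deploy job template might look like this; the parameter names mirror my assumptions from the build template:

```yaml
# ./build/site-deploy-job.yml (sketch)
parameters:
  name: ''
  condition: ''
  site_storage_account: ''
  site_storage_key: ''

jobs:
  - job: ${{ parameters.name }}
    condition: ${{ parameters.condition }}
    pool:
      vmImage: ubuntu-latest
    steps:
      # 1: download build artifacts
      # 2: install Hugo
      # 3: extract the zipped site files
      # 4: hugo deploy (for real this time)
      - script: echo "steps detailed in the sections below"
```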

The job is well documented with comments as you can see in the snippet above, but I’ll explain each step in more detail in a moment. Before we do that, let’s jump back to the azure-pipelines.yml file and see how this job is run for my live site:
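The deploy stage might reference the template along these lines; as before, the variable names are illustrative assumptions:

```yaml
- stage: deploy
  dependsOn: build
  jobs:
    - template: ./build/site-deploy-job.yml
      parameters:
        name: deploy_live_site
        condition: eq(variables['Build.SourceBranch'], 'refs/heads/master')
        site_storage_account: $(liveSiteStorageAccount)
        site_storage_key: $(liveSiteStorageKey)
```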

Nothing special to call out here, as there are even fewer parameters than in the build stage.

Here’s what the deploy log looks like when it runs. I’m only covering the live site in the detailed explanation, but here you can see the preview deploy as well.

Let’s jump back to the job template file, ./build/site-deploy-job.yml, & look at the actual steps in detail.

Deploy Step 1: Download Build Artifacts

We first need the things that were acquired or created in the build stage, so let’s download those two artifacts: the Hugo installer & the ZIP of built site files:
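This could be done with two DownloadBuildArtifacts tasks, assuming the artifact names used when publishing:

```yaml
# Pull down the two artifacts published by the build stage
- task: DownloadBuildArtifacts@0
  displayName: Download Hugo installer
  inputs:
    artifactName: hugo
    downloadPath: $(System.ArtifactsDirectory)

- task: DownloadBuildArtifacts@0
  displayName: Download zipped site
  inputs:
    artifactName: site
    downloadPath: $(System.ArtifactsDirectory)
```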

Deploy Step 2: Install Hugo

Next, install Hugo from the installer we just downloaded. Nothing different here from the build stage:
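Assuming the downloaded artifact layout above, this is a one-liner:

```yaml
# Install Hugo from the .deb downloaded as a build artifact
- script: sudo dpkg -i $(System.ArtifactsDirectory)/hugo/hugo.deb
  displayName: Install Hugo
```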

Deploy Step 3: Extract Built Site Files (zip)

We zipped up the built files in the build stage, so we need to unpack everything in order to deploy them:
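The built-in ExtractFiles task handles this; the paths are assumptions:

```yaml
# Unpack the zipped site so hugo deploy can see the files
- task: ExtractFiles@1
  displayName: Extract built site files
  inputs:
    archiveFilePatterns: $(System.ArtifactsDirectory)/site/site.zip
    destinationFolder: $(Build.SourcesDirectory)/public
```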

Deploy Step 4: Upload New/Changed & Delete Removed Files

As I explained above in the build stage, Hugo’s deploy command can deploy a Hugo site to a public cloud provider. The only difference here from the build stage is the omission of the --dryRun argument:
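A sketch of the final step, again assuming the storage parameters from the job template:

```yaml
# Upload new/changed files & delete removed ones - for real this time
- script: hugo deploy
  displayName: Deploy site to Azure Storage
  workingDirectory: $(Build.SourcesDirectory)
  env:
    AZURE_STORAGE_ACCOUNT: ${{ parameters.site_storage_account }}
    AZURE_STORAGE_KEY: ${{ parameters.site_storage_key }}
```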


That’s it! To recap, these are the files that implement the entire process:

  • azure-pipelines.yml: the main pipeline definition
  • ./build/site-build-job.yml: the build job template
  • ./build/site-deploy-job.yml: the deploy job template

I have some plans to improve this process by adding new functionality and making it a bit easier, but this works for now. Depending on when you’re reading this, you might want to check the Hugo category as I may have already made some of these changes.
