Wednesday 30 September 2015

Publishing Hugo to AWS S3 with Make & git hooks

I’ve recently switched this blog to use hugo, a mighty fine static site generator written in Go. Static site generators are great because they have excellent response times (no application server or database) and are very hard to break (it’s just html). However, publishing new content does add a few hoops to jump through, because you have to move files from a local machine to a remote one.

I’m a fan of automation, so I wanted this process of moving files to be as invisible as possible. I achieve this by utilising git hooks, a humble Makefile and the aws-cli tool.

The Git hook

I piggyback on the post-commit git hook, which is a client-side hook. Here’s what it looks like …

#!/bin/sh
# get the branch and store it in a variable.
branch=`git rev-parse --symbolic-full-name --abbrev-ref HEAD`

# I only want to "do stuff" if we're on the master branch
if [ "$branch" = "master" ]
then
    # the makefile does all the publishing and is explained below.
    make publish
fi

The file lives in your-repo/.git/hooks/post-commit and needs to be made executable via chmod +x post-commit.
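
Git doesn’t track anything inside .git/hooks, so I find it handy to keep the hook in the repository and copy it into place. A minimal sketch, assuming the hook lives in a scripts/ directory (that location is just an example):

    # install the post-commit hook and make it executable
    cp scripts/post-commit .git/hooks/post-commit
    chmod +x .git/hooks/post-commit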

The Makefile

Here’s the makefile I use to do all of the work above. Each step is explained in the comments, and the cache busting gets its own section below …

publish:
    # 1. blow up the last deploy directory
    # I do this just to make sure I'm publishing the right build.
    # I never want to publish something that isn't ready.
    rm -rf deploy
    
    # 2. run gulp to build scss into css
    # I use scss to style this blog and when I publish I like 
    # to make sure I'm using the latest build of the css. 
    # I use a little cache busting technique to bypass
    #  the caching on S3, but more on that below.
    gulp css
    
    # 3. run hugo
    # This is straightforward. I set CACHE (an md5 hash of the current
    # datetime used for cache busting, more on that below) so the templates
    # can read it at build time, and tell hugo to publish into the
    # `deploy` directory.
    CACHE=`date | md5` hugo -d deploy

    # 4. cd into deploy, set a temp envvar and run s3 sync
    # cd into deploy
    # set AWS_PROFILE so that the aws-cli can authenticate as the correct
    # user (profile setup is shown below)
    # finally call aws
    cd deploy; AWS_PROFILE=jc aws s3 sync . s3://jamie.curle.io --region eu-west-1
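
One assumption in step 4: AWS_PROFILE=jc only works if the aws-cli already knows about a named profile called jc. If you haven’t got one, aws configure will prompt for the access key, secret key and default region and store them under that name:

    # one-off: create the named profile the makefile refers to
    aws configure --profile jc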

Cache Busting

S3 can be a little overzealous in its caching. I overcome this by using a hash of the current datetime as a querystring value on the stylesheet URL, which changes on every publish and forces a fresh copy to be fetched. It’s basic …

    CACHE=`date | md5`
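
A small caveat: md5 is the macOS name for the command. On Linux the equivalent is md5sum, which also prints a trailing filename, so something along these lines should produce the same kind of value:

    # Linux equivalent: keep just the hash, drop the trailing filename
    CACHE=`date | md5sum | cut -d' ' -f1`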

I tie that into the template like this (I’m using the ace templating language):

link rel=stylesheet href=/css/jc.css?cache={{ getenv "CACHE" }}
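
With CACHE in the environment when hugo builds, the published html ends up looking something like this (the hash is just an illustrative value):

    <link rel="stylesheet" href="/css/jc.css?cache=d41d8cd98f00b204e9800998ecf8427e">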

Finally we have the actual command that does everything: aws s3 sync . s3://jamie.curle.io --region eu-west-1. This makes very light work of syncing the current directory up to S3.
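
Two sync flags worth knowing about if you go down this route: --dryrun previews what would be uploaded without touching the bucket, and --delete also removes objects from the bucket that no longer exist locally (use that one with care):

    # preview the sync without uploading anything
    aws s3 sync . s3://jamie.curle.io --region eu-west-1 --dryrun

    # mirror deletions as well as additions
    aws s3 sync . s3://jamie.curle.io --region eu-west-1 --delete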

And that’s it - super simple publishing, super fast website hosting and best of all - it’s plain html.