Serverless Publishing

Posted by Chris Rosser on Mon 28 May 2018
Hello! This site is archived and no longer maintained. For Chris' main site go to

Since 2014 I’ve hosted this website on DigitalOcean, an outstanding provider of Virtual Private Servers (VPS). A VPS is exactly what it sounds like: a virtualised server, complete with root access to a Linux distro of your choice, hosted on cloud infrastructure. DigitalOcean is by no means unique, but they are inexpensive and perform very well. They make it incredibly easy to spin up a server and thanks to them I was able to learn a lot about Linux and move away from the dodgy world of shared hosting.

Throughout that time, my site and my email server ran off a single VPS running Ubuntu 14.04. It’s performed very well indeed, but it’s now 2018; next year 14.04 will reach the end of its life, and I have the unenviable task of migrating the system to 18.04.

Migrating the whole system will take me at least a day; then there’s the potential downtime while my domain name’s DNS records are updated and propagated throughout the internet. A few years ago, that wouldn’t have bothered me, but with the imminent publication of my first book (and more to follow), I can’t afford the downtime.


Besides, there are things I’d rather spend my time on than managing Linux servers.

Fortunately, the world’s moving fast and serverless hosting is now a thing. It’s not new, but it’s a growing trend, driven by heavyweights Amazon and Google, and to a lesser extent, GitHub.

The idea of serverless hosting is simple: host a website on someone else’s infrastructure and let them handle the underlying operating system, patching and updates. The drawback is that most serverless hosts don’t let you run server-side scripts such as PHP, making it impossible to run WordPress. That’s fine by me: my website has been static since 2013.

So, I decided that it made a lot of sense to decouple my website from my mail server and migrate it to a serverless provider.

I evaluated the obvious choices of Amazon S3, Google Cloud and GitHub pages. As part of the shift, I knew I wanted to store my website’s content in Git and be able to publish with a push command. That made GitHub Pages highly appealing, not least because the static site generator I use has instructions for doing so.

However, in my search I came across a relative newcomer, Netlify. They offer all of the advantages of GitHub Pages with none of its drawbacks. Basic hosting of static sites is free, including on their worldwide content delivery network (CDN). They provide TLS certificates to encrypt site traffic. They even give you enough compute to generate a static site using the engine of your choice¹. Add the availability of AWS Lambda Functions, Identity and Forms, and Netlify stretches the limits of what a static site can do.

Best of all, the build can be kicked off automatically from a push to a Git repository hosted on GitHub, BitBucket or GitLab.
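In practice, that workflow is just ordinary Git. A sketch of the setup, with a placeholder BitBucket URL (substitute your own repository):

```shell
# One-time setup: put the site content under Git and point it at BitBucket.
# The remote URL and identity below are placeholders for this example.
git init my-site
cd my-site
git config user.name  "Chris"
git config user.email "chris@example.com"
git remote add origin git@bitbucket.org:youruser/my-site.git

# Day to day: write some content and commit it.
mkdir -p content
echo "Title: A new post" > content/new-post.md
git add content/new-post.md
git commit -m "Add new post"

# Publishing is then just:
#   git push origin master
# Netlify watches the repository and kicks off a build on every push.
```

The push itself is the deploy trigger; there is no separate upload step.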

Oh, and, did I mention it was free?

The move to Git and BitBucket

The move is predicated on the requirement (and my desire) to manage my website using Git.

Previously my site's content was stored in Dropbox, and from there I ran a build script that sent the content off to my server, where it was compiled into my website using Pelican. Although Dropbox allowed me to write content (almost) everywhere², the build process could only be initiated from my MacBook.

So, I’ve moved to Git, putting my site’s content under revision control and hosting it on BitBucket, Aussie company Atlassian’s GitHub alternative. I chose BitBucket because they provide private repositories for free.

With the site hosted on BitBucket I can add and edit content easily on my MacBook, on the BitBucket website, or even on iOS using the excellent Working Copy app.

As for writing content — well, I’ve got something of a bombshell to drop there, but I’ll address that in my next post!

Building static sites with Netlify

Netlify’s stated mission is to:

Build, deploy, and manage modern web projects

The platform is targeted at developers who create web apps with HTML/CSS and JavaScript, and who get their data by consuming JSON APIs rather than directly querying a self-hosted database. In other words — a static website.

Building with Netlify is surprisingly easy — much easier, in fact, than messing around with Amazon’s S3 buckets and Cloudflare. Considering that my site is built from a lot of content, plus several Python libraries, Pelican plugins and my own theme, I wasn’t really expecting it to work.

Over the last week, I created a test project and one-by-one I added the libraries, plugins and finally my theme. I tweeted the results over on Twitter.

Pelican isn’t as popular as other static-site generators like Jekyll or Hugo, but Netlify fully supports Python and the Makefile I use to build the site. All I had to do was provide a requirements.txt file containing the modules and libraries I need.
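For reference, that build setup can also be captured in a netlify.toml file at the repository root. A minimal sketch, assuming Pelican's stock Makefile and its default output/ directory (the same settings can be entered in the Netlify dashboard instead):

```text
[build]
  command = "make html"   # Pelican Makefile target that generates the site
  publish = "output"      # directory Pelican writes the generated HTML into
```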

Setup via the dashboard is intuitive and easy. The build process lets you inject your own headers (basically Netlify’s take on .htaccess) and even gives you the ability to compress your images (losslessly) and bundle some of your CSS and JavaScript files in the name of tuning performance.
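Those custom headers go in a _headers file in the publish directory: one URL pattern per line, followed by indented header lines. A small illustrative example (the specific rules here are assumptions, not necessarily the ones this site uses):

```text
/*
  X-Frame-Options: DENY
  X-Content-Type-Options: nosniff
/theme/fonts/*
  Cache-Control: public, max-age=31536000
```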

The only thing I lost was the ability to use MultiMarkdown as my Markdown compiler, but I was able to cobble together most of its capabilities using Python-Markdown extensions. Many third-party extensions aren’t available directly through pip, but I found I could reference their GitHub repositories directly in the requirements.txt file, like so:
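A sketch of such a requirements.txt; pip can install straight from a Git repository, though the extension URL shown here is a placeholder rather than the exact repository used:

```text
pelican
Markdown
# pip installs this package directly from its Git repository:
git+https://github.com/example/markdown-figcaption.git#egg=markdown-figcaption
```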


The extension I referenced above adds figure captions, something MultiMarkdown does but Python Markdown does not.

Completing the migration

To complete the migration, all that was left was to change my domain’s DNS records to point to the IP address of Netlify’s CDN. Once that occurred, I was able to provision my Let’s Encrypt TLS certificate and resume serving content over HTTPS. Propagation began within 30 minutes of updating my DNS.
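For illustration, the resulting records look something like this zone-file fragment. All values are placeholders; the real load-balancer IP and site address come from Netlify's dashboard:

```text
; apex record points at Netlify's load balancer (placeholder IP)
example.com.      3600  IN  A      203.0.113.10
; www is a CNAME to the Netlify-provided site address (placeholder)
www.example.com.  3600  IN  CNAME  example-site.netlify.com.
```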

Thanks to this approach, there was no downtime, even during the DNS migration. I left an identical copy of my site running on my DigitalOcean droplet so that DNS servers that hadn’t updated yet would direct people to my old server.

What about DigitalOcean?

Well, I still use and recommend DigitalOcean. It will continue to run my email server for the foreseeable future. However, by decoupling my website, I’m no longer faced with the prospect of pulling it offline while I migrate the mail server to Ubuntu 18.04.


So far, so good. By the time I publish this post, the site will have been live on Netlify’s CDN for about 24 hours. Because I’m using a CDN, my site will load faster for people outside the East Coast of the USA (where I host my VPS). Initial speed tests (using Pingdom) are positive, and there’s room for optimisation, which I’ll tackle slowly over the coming days.

In my next post, I’ll discuss how migration has allowed me to change the way I write content — and as I noted above, it’s a bit of a bombshell!

  1. GitHub Pages only supports Jekyll. 

  2. Dropbox won’t run everywhere, not least on my Raspberry Pis. 
