Oct 06, 2020

My Website on AWS S3 using Hugo and Github Actions

Initially I simply hosted my website as static HTML pages on a small hosting package. The drawback of this approach is that adding new content is laborious, since an entire HTML page has to be written by hand every time. Therefore I decided to switch the website to Hugo.

Hugo is a fast, Go-based static site generator which makes the website easier to manage. Using Markdown you can create new pages and posts, which are automatically converted to HTML using predefined layouts. This allows, for instance, these notes (blog) to be expanded dynamically without adding new HTML pages by hand. In addition, up-to-date content is possible without PHP or other server-side languages. Since the output of Hugo is just plain HTML, I decided to deploy the website via an S3 bucket on AWS. By enabling the static website hosting feature and setting the permissions to public, the S3 bucket is publicly reachable through HTTP.
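Making the bucket public usually comes down to attaching a bucket policy that allows anonymous reads. A minimal sketch could look like this (the bucket name example-website is a placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::example-website/*"
    }
  ]
}
```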

Routing requests to S3

The S3 website endpoint does not support HTTPS, therefore AWS CloudFront (CDN) is placed in front of the S3 bucket. For routing the www. address a simple CNAME record is set. However, this approach is not possible for the root domain, since CNAME only works for subdomains. Some providers (e.g. AWS, Porkbun) support an ALIAS record to solve this; unfortunately, ALIAS is a non-standard DNS record type. If ALIAS is not supported, Route 53 from AWS may be a suitable option. In that case the domain either has to be registered with AWS or the AWS name servers have to be set at the domain registrar. Besides SSL termination and redirection from HTTP to HTTPS, CloudFront also handles edge caching, which results in faster delivery of the pages.
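In zone-file notation, the two records might look roughly like this; the distribution domain d111111abcdef8.cloudfront.net is a made-up placeholder, and ALIAS is shown only schematically since it is provider-specific and has no standard zone-file syntax:

```
www.example.com.  300  IN  CNAME  d111111abcdef8.cloudfront.net.
example.com.      300  IN  ALIAS  d111111abcdef8.cloudfront.net.
```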

The illustration below shows the interaction of the individual components:

AWS S3 - website setup

Github Action for automatic deployment on S3

I always wanted to try out GitHub Actions, so the main branch now synchronizes itself with the S3 bucket. All you have to do is create an S3 access key and secret key with the required permissions based on a policy. Afterwards the secrets are added to the repo on GitHub. Finally, this GitHub Actions configuration does the magic:


name: Hugo S3 deployment
on: push
jobs:
  deploy:
    runs-on: ubuntu-18.04
    steps:
      - name: Git checkout
        uses: actions/checkout@v2

      - name: Setup hugo
        uses: peaceiris/actions-hugo@v2
        with:
          hugo-version: "0.75.1"

      - name: Build
        run: hugo --minify

      - name: Sync to S3
        uses: jakejarvis/s3-sync-action@master
        with:
          args: --acl public-read --follow-symlinks --delete
        env:
          SOURCE_DIR: 'public'
          AWS_S3_BUCKET: ${{ secrets.AWS_S3_BUCKET }}
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          AWS_REGION: 'eu-central-1'

First, Hugo compiles a minified version of the site; afterwards the resulting public directory (containing the HTML/CSS files) gets pushed to the S3 bucket.
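For the sync step to succeed, the access key mentioned above belongs to an IAM user whose policy only needs to allow listing the bucket and writing/deleting objects. A sketch of such a policy could be (again, example-website is a placeholder for the actual bucket name):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::example-website"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:PutObjectAcl", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::example-website/*"
    }
  ]
}
```

Granting only these actions keeps the CI credentials narrowly scoped, so a leaked key cannot touch other buckets or services.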

Of course the overall setup is a bit of overkill for a simple website, but your own website is also a nice playground for a small pipeline 😉