
Deploying Hugo to S3 via Bitbucket

Apr 9, 2020 · 604 words · 3 minute read
bitbucket, writing

I recently switched this site to Hugo (gohugo.io) - a static site generator. That is, the website is generated once, at the point content changes. This is different from many content management systems today, which generate content dynamically for each request.

The result is a website that loads fast - very fast.

It also makes it simple to host anywhere that can serve files - which is almost everywhere.

Hugo to Build Blazingly Fast Websites

One of the advantages of Hugo is that all the files needed to generate a site can be stored under source code control. I happen to use Bitbucket.

Adding a new page or post to a website is as easy as running the command hugo new posts/my-first-post.md, which creates a new page, then editing the page using Markdown, and finally generating the updated site with the hugo command.
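Put together, the editing cycle looks roughly like this (the post name is just an example):

    # create a new page from the site's archetype template
    hugo new posts/my-first-post.md

    # ...edit the new Markdown file in your editor of choice...

    # regenerate the site into the public/ folder
    hugo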

The updated file can be committed to the source repository and the newly generated site can be deployed to an appropriate host. In my case, I’m using AWS S3.


Hosting on S3 does have its challenges. The first of which is how to get the newly generated site copied over.

At first, I used a file-sync tool called Transmit by Panic. It’s a fantastic utility. However, I didn’t want to keep syncing manually each time I updated a file.
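If you prefer syncing from the command line instead, the AWS CLI can do roughly the same job; this is a sketch, assuming the CLI is installed and configured with credentials, and that you replace <S3 Bucket> with your bucket name:

    # upload only new or changed files from the generated site,
    # and delete remote files that no longer exist locally
    aws s3 sync public/ s3://<S3 Bucket>/ --delete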

Instead, what I wanted was for every commit to source code control to generate a new site and upload the result to AWS.

Automatically Deploying S3 Using Bitbucket Pipelines

Fortunately, Bitbucket has a feature called Pipelines, an automation tool for running actions after code commits.

It took a fair amount of research to determine the best way to do this. And to hopefully save you some trouble, I’ve included my pipeline here.

    image: atlassian/default-image:2
    
    options:
      max-time: 5
    
    pipelines:
      default:
      - step:
          name: Build Hugo
          script:
        - apt-get update -y && apt-get install -y wget
            - apt-get -y install git
            - wget https://github.com/gohugoio/hugo/releases/download/v0.68.3/hugo_0.68.3_Linux-64bit.deb
            - dpkg -i hugo*.deb
            - hugo --minify
          artifacts:
            - public/**
      - step:
          name: Deploy to S3
          deployment: production
          script:
            - pipe: atlassian/aws-s3-deploy:0.4.1
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: '<S3 Region>'
                S3_BUCKET: '<S3 Bucket>'
                LOCAL_PATH: 'public'

The first line loads v2 of the default build image provided by Atlassian, as described here: https://hub.docker.com/r/atlassian/default-image

The options block sets the maximum build time to 5 minutes, which is more than adequate for a Hugo build. This is primarily to prevent a runaway process from burning through all of a month’s build minutes.

Then the pipeline definition is next, with two steps defined.

The first step downloads and installs Hugo v0.68.3 for 64-bit Linux. The latest releases and builds of Hugo can be found here: https://github.com/gohugoio/hugo/releases. The generated public folder is saved as an artifact, which is passed to the subsequent step.

The second step deploys the public folder (captured by the artifact in the previous step) to S3. You’ll need to define two environment variables, one for your Access Key ID and one for your Secret Access Key, and then update the <S3 Region> and <S3 Bucket> placeholders with your own region and bucket information.

Once this pipeline is defined, every time you commit to the repository, Hugo builds the site and the result is deployed to S3.
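To confirm a deployment landed, you can list what’s in the bucket from the command line; a sketch, assuming the AWS CLI is configured and <S3 Bucket> is replaced with your bucket name:

    # list every object currently in the bucket
    aws s3 ls s3://<S3 Bucket>/ --recursive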

Using Forestry.io to Edit Hugo Pages

One last tip: I use Forestry.io as a WYSIWYG editor and front end for creating and editing Hugo pages. It’s configured to use Bitbucket as the source repository, so every time I save edits to a page or blog entry, my website is automatically updated.


© Copyright 2023
Ron Lancaster