Posted: 12/23/2020 • Updated: 01/01/2021
I wanted to create a small website where I could document what I learn across the different areas of technology I pursue. Writing articles for this site reinforces what I've learned, and the content doubles as a refresher for my future self.
This website is built using the Hugo framework, a popular static site generator written in Go. I wanted something that was easy to create content for and easy to deploy.
Below are the steps I used to get this site hosted on AWS S3 with automated deployments using Github Actions.
The first thing we need to do is create an S3 bucket. This bucket is where we will upload all the static assets used to render the site.
There are many ways to create an S3 bucket: directly through the AWS Console, with the AWS CLI, with Terraform, and so on.
For this site, I ended up using Terraform to manage my S3 bucket so that I can reference and maintain it through code going forward.
The following code assumes you already have the AWS CLI and Terraform configured on your local system and that you are deploying to the us-west-2 region.
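For completeness, a minimal provider configuration might look like the following. This is just a sketch under my assumptions: local state, the us-west-2 region from above, and a version 3.x AWS provider (adjust the version constraint to whatever you actually use).

```hcl
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      # Assumed version constraint; pin to the provider version you use.
      version = "~> 3.0"
    }
  }
}

provider "aws" {
  region = "us-west-2"
}
```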
S3 Bucket (publicly accessible)
Reference: https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/s3_bucket
resource "aws_s3_bucket" "site" {
  bucket = "<your-bucket-name>"
  acl    = "public-read"

  website {
    index_document = "index.html"
  }

  versioning {
    enabled = false
  }
}
S3 Bucket Policy (read-only)
Reference: https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/s3_bucket_policy
resource "aws_s3_bucket_policy" "site_policy" {
  bucket = aws_s3_bucket.site.id
  policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": [
        "s3:GetObject"
      ],
      "Resource": [
        "arn:aws:s3:::<your-bucket-name>/*"
      ]
    }
  ]
}
EOF
}
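If you'd rather not hand-maintain the policy JSON inside a heredoc, you can also generate it. Here is an illustrative Python sketch (the function name is my own, and the bucket name is a placeholder) that builds the same read-only policy document:

```python
import json

def public_read_policy(bucket_name: str) -> str:
    """Build the same public-read bucket policy as the heredoc above."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "PublicReadGetObject",
                "Effect": "Allow",
                "Principal": "*",
                "Action": ["s3:GetObject"],
                # Grant read access to every object in the bucket.
                "Resource": [f"arn:aws:s3:::{bucket_name}/*"],
            }
        ],
    }
    return json.dumps(policy, indent=2)

print(public_read_policy("<your-bucket-name>"))
```

Terraform's built-in jsonencode() achieves the same thing natively, which is why the plan output later renders the policy in that form.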
Example terraform plan command output
Terraform will perform the following actions:

  # aws_s3_bucket.site will be created
  + resource "aws_s3_bucket" "site" {
      + acceleration_status         = (known after apply)
      + acl                         = "public-read"
      + arn                         = (known after apply)
      + bucket                      = "<your-bucket-name>"
      + bucket_domain_name          = (known after apply)
      + bucket_regional_domain_name = (known after apply)
      + force_destroy               = false
      + hosted_zone_id              = (known after apply)
      + id                          = (known after apply)
      + region                      = (known after apply)
      + request_payer               = (known after apply)
      + website_domain              = (known after apply)
      + website_endpoint            = (known after apply)

      + versioning {
          + enabled    = false
          + mfa_delete = false
        }

      + website {
          + index_document = "index.html"
        }
    }

  # aws_s3_bucket_policy.site_policy will be created
  + resource "aws_s3_bucket_policy" "site_policy" {
      + bucket = (known after apply)
      + id     = (known after apply)
      + policy = jsonencode(
            {
              + Statement = [
                  + {
                      + Action    = [
                          + "s3:GetObject",
                        ]
                      + Effect    = "Allow"
                      + Principal = "*"
                      + Resource  = [
                          + "arn:aws:s3:::<your-bucket-name>/*",
                        ]
                      + Sid       = "PublicReadGetObject"
                    },
                ]
              + Version   = "2012-10-17"
            }
        )
    }

Plan: 2 to add, 0 to change, 0 to destroy.
Once you have verified your new S3 bucket has been created and is publicly accessible, we can move on to deploying our site contents to it.
Since I am using the Hugo framework, generating the static files and deploying them to the S3 bucket is straightforward.
Hugo Deploy Reference: https://gohugo.io/hosting-and-deployment/hugo-deploy/
Sample Hugo Deployment Configuration (config.toml)
[deployment]

[[deployment.targets]]
URL  = "s3://<your-bucket-name>?region=us-west-2"
name = "s3-bucket-deployment"

[[deployment.matchers]]
cacheControl = "max-age=31536000, no-transform, public"
gzip         = true
pattern      = "^.+\\.(js|css|svg|ttf)$"

[[deployment.matchers]]
cacheControl = "max-age=31536000, no-transform, public"
gzip         = false
pattern      = "^.+\\.(png|jpg)$"

[[deployment.matchers]]
gzip    = true
pattern = "^.+\\.(html|xml|json)$"
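The pattern fields above are ordinary regular expressions matched against each deployed file's path, and max-age=31536000 is one year in seconds. A quick illustrative check (Python, independent of Hugo itself) of which files each matcher would catch:

```python
import re

# The three matcher patterns from the config above.
assets = re.compile(r"^.+\.(js|css|svg|ttf)$")  # cached + gzipped
images = re.compile(r"^.+\.(png|jpg)$")         # cached, not gzipped
pages  = re.compile(r"^.+\.(html|xml|json)$")   # gzipped, no cache header

print(bool(assets.match("css/main.css")))   # True
print(bool(images.match("img/logo.png")))   # True
print(bool(pages.match("index.html")))      # True

# max-age of 31536000 seconds is exactly 365 days.
print(31536000 == 365 * 24 * 3600)          # True
```

The long cache lifetime is safe for fingerprinted assets; HTML is deliberately left without a cache header so new deployments show up immediately.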
Deploying Content
Run the following command to generate the static files that will be ultimately deployed.
hugo
Run the following command to deploy to the deployment target defined in the config.toml configuration file.
hugo deploy
Here is an example of what the deployment command output might be.
Deploying to target "s3-bucket-deployment" (s3://<your-bucket-name>?region=us-west-2)
Identified 82 file(s) to upload, totaling 2.8 MB, and 0 file(s) to delete.
Success!
Once deployed, you should be able to see your site rendering properly when you navigate to your generated bucket endpoint.
Example: http://<your-bucket-name>.s3-website-us-west-2.amazonaws.com/
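The website endpoint is derived from the bucket name and region, so it can be composed programmatically. A small sketch (the function name is my own; note that a few older AWS regions use a dot instead of a dash before the region, so treat this as illustrative):

```python
def s3_website_endpoint(bucket: str, region: str) -> str:
    """Compose the S3 static website endpoint for a bucket/region pair."""
    # Dash-style endpoint, used by us-west-2 among others.
    return f"http://{bucket}.s3-website-{region}.amazonaws.com/"

print(s3_website_endpoint("<your-bucket-name>", "us-west-2"))
```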
Now that the content has been deployed and you have verified the generated bucket endpoint works as expected, it is time to configure DNS so that traffic is routed through a properly registered domain name (ex: timkipp.com).
I made use of Route 53 to register my domain name and setup my public hosted zone.
Once my public hosted zone was created, I added a new A record that was aliased to my S3 website endpoint.
Route 53 Zone
Reference: https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/route53_zone
resource "aws_route53_zone" "site" {
  name = "<your_domain_name_here>"
}
Route 53 Record (A)
Reference: https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/route53_record
resource "aws_route53_record" "a" {
  zone_id = aws_route53_zone.site.zone_id
  name    = "<your_domain_name_here>"
  type    = "A"

  alias {
    name                   = aws_s3_bucket.site.website_domain
    zone_id                = aws_s3_bucket.site.hosted_zone_id
    evaluate_target_health = true
  }

  depends_on = [aws_s3_bucket.site]
}
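To make the relevant values easy to grab after a terraform apply, you could also add output values. A sketch (the output names are my own):

```hcl
# Prints the S3 website endpoint after apply, for quick verification.
output "website_endpoint" {
  value = aws_s3_bucket.site.website_endpoint
}

# The name servers to configure at your registrar if the domain
# was registered outside of Route 53.
output "name_servers" {
  value = aws_route53_zone.site.name_servers
}
```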
Once the DNS propagates, you should be able to access your static site at the registered domain name.
I did not want to worry about running deployments manually from my local machine every time (even though Hugo makes it simple). And on the off chance other people contribute to this site in the future, I wanted automation in place so that neither I nor other contributors have to set up local AWS credentials or request access to push to the S3 bucket just to publish new content.
For this, I am utilizing Github Actions to perform the deployment whenever new code changes are pushed to the main branch.
The following assumes you are using Github as your SCM provider.
I added a file named deploy-s3.yml under the .github/workflows directory (which might not exist yet). This file defines the workflow that will be triggered whenever changes are pushed to the main branch of my Github repository.
Make sure you add the following Github Secrets to your repository: AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
name: Deploy to S3

on:
  push:
    branches:
      - main

jobs:
  build-and-deploy:
    name: Build and Deploy
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Source Code
        uses: actions/checkout@v2

      - name: Download Hugo
        env:
          HUGO_VERSION: 0.79.1
        run: |
          HUGO_PATH=hugo_extended_${HUGO_VERSION}_Linux-64bit.tar.gz
          wget https://github.com/gohugoio/hugo/releases/download/v${HUGO_VERSION}/${HUGO_PATH}
          tar xvzf ${HUGO_PATH} hugo
          mv hugo $HOME/hugo

      - name: Download Hugo Theme
        run: git submodule sync && git submodule update --init

      - name: Build Site
        run: HUGO_ENV=production $HOME/hugo -v --minify

      - name: Deploy Site
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        run: $HOME/hugo -v deploy --maxDeletes -1
Once you have committed this workflow file, any changes pushed to the main branch of your repository should automatically trigger a deployment.
You are now ready to start focusing more on your content and less on the deployments. Happy creating!