Nikolas Knickrehm


Syncing files to S3 using the Serverless Framework

Infrastructure as Code

Normally, you would use the Serverless Framework to deploy services consisting of one or more Lambda functions and other related resources of your cloud provider. A different use case is uploading a bunch of files to an S3 bucket that hosts a website. Here is a brief guide on how to do this.

This can be read as part two of my post on Deploying SwaggerUI with Terraform and S3 for Multiple Teams. You don't need to read the other post, though, because the problem addressed here can also be encountered and solved on its own. The files used as an example belong to a Swagger UI setup, but they could be anything else you keep in a Serverless project. Our setup requires uploading files to one S3 bucket, but into separate directories depending on the stage of the Serverless deployment. The idea is that each stage, containing an isolated deployment of our API, should also have its own API documentation, which we keep in the same repository. This way every API version, even a pre-release one, is documented for our consumers without us having to set up a complete Swagger UI every time.
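To make the idea concrete: with a bucket and prefix like the ones configured later in this post (all names, including the swagger.json file, are placeholders), each stage's documentation would end up at a predictable URL such as:

```
https://your-bucket.s3.amazonaws.com/subdirectory-of-the-bucket/dev/swagger.json
https://your-bucket.s3.amazonaws.com/subdirectory-of-the-bucket/staging/swagger.json
https://your-bucket.s3.amazonaws.com/subdirectory-of-the-bucket/prod/swagger.json
```

Consumers can then always find the documentation for a given stage without anyone handing out links manually.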

The easiest way to sync files to S3 with Serverless is the plugin serverless-s3-sync, which covers a variety of use cases like ours. You could of course write all the functionality yourself, but why would you when a good solution already exists? In case you still prefer to do it on your own: the plugin simply adds a hook to Serverless that is triggered after the deployment has completed and uses the AWS CLI to upload files to S3, so there is no black magic involved! Using the plugin is a lot easier, though, and results in only a few more lines in your Serverless file.
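For illustration, a manual sync for a dev stage would look roughly like this with the AWS CLI (the bucket name and paths are placeholders matching the configuration shown later in this post):

```shell
# Roughly what the plugin does after a deployment: sync a local
# directory to an S3 prefix, uploading only new or changed files.
aws s3 sync ./the-directory-within-your-project \
  s3://your-bucket/subdirectory-of-the-bucket/dev/ \
  --acl public-read
```

This is exactly the kind of glue code the plugin saves you from maintaining yourself.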


First, you will need to install the plugin in your project using npm i serverless-s3-sync and include it in the plugins section of your Serverless file:

service: your-service

plugins:
  - serverless-s3-sync

# ...

Now you can add a few more lines to the custom block of the file:

# ...
custom:
  s3Sync:
    - bucketName: your-bucket
      localDir: the-directory-within-your-project
      bucketPrefix: subdirectory-of-the-bucket/${opt:stage}/ # optional
      deleteRemoved: false # optional
      acl: public-read # we want the files to be public (you too?)
      defaultContentType: application/json # helps setting the correct MIME types
# ...

As you might have guessed, every file within the directory you provide in localDir will be uploaded to S3. In our case we use the stage name as a sub-directory in the bucket, so that we host a Swagger file for every stage and the URLs of these files stay predictable. After setting all of this up, you can trigger a deployment as you normally would to see if it worked. If you only want to test whether the S3 upload is working, you can also use the command sls s3sync to skip the deployment of the other resources and functions.
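Put together, the two commands look like this (the stage name dev is just an example):

```shell
# Full deployment, including the post-deploy file sync:
sls deploy --stage dev

# Only sync the configured directories to S3, skipping everything else:
sls s3sync --stage dev
```

The second command is handy while iterating on the documentation files, since it avoids a full CloudFormation update.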
