Using S3 as a source action for CodePipeline


A rather new feature is the ability to have an S3 bucket trigger CodePipeline (AWS blog). A scheduled-pull option for connecting CodePipeline to S3 has been available for a while, but since March 2018 a 'push' option is possible as well. There are good references available on setting up CodePipeline itself, so that is skipped in this blog post. Let's run through a scenario where a new version arrives via Lambda. Let's say the S3 bucket to be used is s3sourcebucket and the code is in file.zip.

First, versioning has to be turned on for the S3 bucket, because the CodePipeline job identifies the source by the object's version identifier. Once versioning is enabled, the S3 bucket is all set.
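For reference, versioning can also be turned on from the command line; a minimal sketch using the AWS CLI and the bucket name from above:

# enable versioning so every upload of file.zip gets its own version id
aws s3api put-bucket-versioning \
  --bucket s3sourcebucket \
  --versioning-configuration Status=Enabled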

Then the focus is on CodePipeline. Let's use the CodePipeline console to configure s3://s3sourcebucket/file.zip as the code source, because (according to the documentation) the required CloudTrail trail and CloudWatch Events rule are then created automatically and linked to the pipeline. Whenever a new version is uploaded to the bucket, CloudTrail logs the API call, the CloudWatch Events rule catches the logged event and invokes the corresponding CodePipeline.

Below is an example of the automatically generated CloudWatch Events rule. By default, the event pattern is built to match the entire event.

{
  "source": [ "aws.s3" ],
  "detail-type": [ "AWS API Call via CloudTrail" ],
  "detail": {
    "eventSource": [ "s3.amazonaws.com" ],
    "eventName": [ "PutObject" ],
    "resources": {
      "ARN": [ "arn:aws:s3:::s3sourcebucket/file.zip" ]
    }
  }
}

As the snippet points out, the CloudWatch Events rule is looking for "PutObject" as the event name. That is the correct event name for a 'put', for example when an IAM user uploads a fresh file to the bucket. However, if the file lands in the bucket through a copy operation, for example when a Lambda function copies the file from somewhere else into the bucket, then the event name does not match and the CodePipeline won't be invoked. The correct event name for that case is "CopyObject". As a general debugging hint, double-check the CloudTrail log for the actual event name and confirm that the same string is used in the CloudWatch Events rule.
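One way to handle both cases is to widen the event pattern to match several event names. A sketch of that from the CLI, assuming the auto-generated rule happens to be named codepipeline-s3sourcebucket-rule (the real name is generated by the console, so check it first):

# let put, copy and completed multipart uploads all trigger the pipeline
aws events put-rule \
  --name codepipeline-s3sourcebucket-rule \
  --event-pattern '{
    "source": [ "aws.s3" ],
    "detail-type": [ "AWS API Call via CloudTrail" ],
    "detail": {
      "eventSource": [ "s3.amazonaws.com" ],
      "eventName": [ "PutObject", "CopyObject", "CompleteMultipartUpload" ],
      "resources": { "ARN": [ "arn:aws:s3:::s3sourcebucket/file.zip" ] }
    }
  }'

The same edit can of course be made directly in the CloudWatch console by adding "CopyObject" to the eventName list.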

-Tero

CodePipeline & CodeBuild & S3 website

Single Page Applications (SPAs) are convenient as they provide a smooth user experience. React is a good choice for building SPAs, but this blog is not about React; it is about the AWS services that can be used for automatic deployment of an SPA. The services to be discussed are CodePipeline, CodeBuild, CodeCommit, and S3.

S3

S3 is an appealing service not only from a storage perspective but also because it can be configured to work as a static website, combining low price with high scalability. An S3 website is also a 'serverless' approach. The lack of a dedicated IP address can be handled with Route 53 (using the S3 website endpoint as an alias). But let's move on and say the name of the static-website bucket is www.myreactapp.com.
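For completeness, static website hosting can be switched on from the CLI as well; a minimal sketch, assuming index.html serves as both the index and the error document (a common setup for SPAs so that client-side routes still resolve):

# turn on static website hosting for the bucket
aws s3 website s3://www.myreactapp.com/ \
  --index-document index.html \
  --error-document index.html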

CodePipeline

CodePipeline can be used as the main framework for the continuous deployment. The pipeline can include several stages, and each stage can be one of a handful of types. A basic pipeline contains just the source of the code (CodeCommit) and a build stage (CodeBuild); there is no need to set up any deployment stage. CodePipeline stores all output artifacts of the stages in an S3 bucket, and if those artifacts are used, for example, by a Lambda function in another stage of the pipeline, an IAM policy with suitable permissions to the CodePipeline artifact bucket should be attached to the Lambda's service role.
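A rough sketch of what such a policy could look like, attached from the CLI (the role name is a placeholder, and the artifact bucket name follows the codepipeline-<region>-<number> pattern the console generates, so check the actual name):

# give the Lambda's service role read access to the pipeline artifact bucket
aws iam put-role-policy \
  --role-name my-lambda-service-role \
  --policy-name read-codepipeline-artifacts \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Action": [ "s3:GetObject", "s3:GetObjectVersion" ],
        "Resource": "arn:aws:s3:::codepipeline-eu-west-1-123456789012/*"
      }
    ]
  }'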

CodeCommit

The Git integration of CodeCommit makes it a convenient starting point for the pipeline. Once Git credentials have been set up for the IAM user, the user is able to connect to CodeCommit. The CLI usage of CodeCommit is no different from any other Git remote; the user experience is exactly the same. Setting up CodeCommit as the source for CodePipeline is presented in figure 1.

Figure 1. Setting up CodePipeline
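On the workstation side the connection is plain Git. A sketch of the HTTPS setup using the AWS CLI credential helper (region and repository name are placeholders):

# let Git authenticate with the IAM user's credentials via the AWS CLI
git config --global credential.helper '!aws codecommit credential-helper $@'
git config --global credential.UseHttpPath true

# clone, commit and push as with any other Git remote
git clone https://git-codecommit.eu-west-1.amazonaws.com/v1/repos/myreactapp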

CodeBuild

A CodeBuild project can be set up through the CodeBuild console, and it is then possible to select the existing project in the CodePipeline console. However, creating the CodeBuild project through the CodePipeline console avoids some issues related to permissions and odd errors. CodeBuild will be invoked by CodePipeline, not by CodeCommit. Once the CodeBuild project has been created through the CodePipeline console, the source is correct (the CodeBuild project has Current source: AWS CodePipeline). Setting up CodeBuild through the CodePipeline console is presented in figure 2.

Figure 2. Creation of CodeBuild project through CodePipeline console
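If you want to double-check that the project really has CodePipeline as its source, the CLI shows it; a quick check, assuming the project ended up being named myreactapp-build:

# the "source" block of the output should report "type": "CODEPIPELINE"
aws codebuild batch-get-projects --names myreactapp-build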

The heart of the CodeBuild project is the buildspec.yml file. The build process is divided into several phases that can be used to run custom commands and scripts (see the example buildspec.yml below). The formal syntax of the YAML file is crucial, and syntax errors are typically not very conveniently identified from the logs, so make sure all those spaces are correct! As shown below (post_build phase), the build files are copied to the www.myreactapp.com bucket. An IAM policy with sufficient permissions to access the bucket should be attached to the CodeBuild service role; a sketch of such a policy follows the buildspec.

version: 0.2

phases:
  pre_build:
    commands:
      - npm install
  build:
    commands:
      - npm run build
  post_build:
    commands:
      - aws s3 sync --delete build/ s3://www.myreactapp.com --acl public-read

Also notice that the version number (0.2) is not a random number; it is the buildspec version defined by AWS.
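For the s3 sync in the post_build phase above, a rough sketch of the kind of policy the CodeBuild service role needs (role and policy names are placeholders; the delete and ACL permissions are there because of the --delete and --acl public-read flags):

# allow the build to list the website bucket and write, set ACLs on and remove objects
aws iam put-role-policy \
  --role-name my-codebuild-service-role \
  --policy-name deploy-to-website-bucket \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Action": [ "s3:ListBucket" ],
        "Resource": "arn:aws:s3:::www.myreactapp.com"
      },
      {
        "Effect": "Allow",
        "Action": [ "s3:PutObject", "s3:PutObjectAcl", "s3:DeleteObject" ],
        "Resource": "arn:aws:s3:::www.myreactapp.com/*"
      }
    ]
  }'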

-Tero