Brief notes from AWS Community Day Nordics

Last Wednesday a new tradition began at Clarion Helsinki. Over 300 AWS enthusiasts witnessed the arrival of the AWS Community Days in the Nordics. The one-day event followed the structure of a typical seminar day, with keynotes and multi-track sessions. Slides are expected to be available afterwards. The tracks were titled ‘Main’, ‘Serverless’, and ‘Data + ML’, and for the latter two the content was pretty self-explanatory. I participated in the Serverless track but unfortunately was not able to stay for the whole day due to other commitments.

The serverless approach was also present in the first keynote, where Martin Buberl from Trustpilot discussed the journey Trustpilot has made towards serverless architecture. The number of EC2 instances has decreased as tasks are processed by Lambdas instead. And as a single EC2 instance is typically replaced by more than a single Lambda, the number of Lambdas is expected to grow faster than the number of EC2 instances shrinks.

The first session in the Serverless track was hosted by Paul Lyons from Nordcloud and focused on the Serverless Framework. The demo was about a serverless backend, but the content itself was not the most interesting element. The framework was somewhat familiar to me in advance, but the discussion strengthened my view that for a serious serverless approach the Serverless Framework is a great tool. It can be extended with various plugins, and the related community is active. For example, there is a plugin for a rather new service called AppSync (my personal favorite service at the moment). Paul mentioned during his talk that the S3 event trigger for Lambda is not working well with Serverless at the moment, and one has to activate it in the console. Using the console is something to avoid when working with the Serverless Framework…
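For reference, this is roughly how an S3 event trigger is declared in a serverless.yml (a minimal sketch — the service, function, and bucket names are made up); per Paul's remark, at the time of writing this mapping may still need to be activated manually in the console:

```yaml
# serverless.yml (sketch; service, function, and bucket names are hypothetical)
service: photo-processor

provider:
  name: aws
  runtime: nodejs8.10

functions:
  resize:
    handler: handler.resize
    events:
      - s3:
          bucket: uploaded-photos      # Serverless creates and manages this bucket
          event: s3:ObjectCreated:*    # fire the Lambda on every new object
```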

Fellows from Zalando, Uri Savelchev & Ruben Diaz, discussed Kubernetes. They have built a custom deployment tool, as they found all the available tools cumbersome to some extent, at least out of the box. Zalando has independent teams, and these teams operate at various sites in Europe. As a result they have numerous AWS accounts. Zalando seems to utilize AWS a lot.

It was a great day and I got lots of new ideas. Hopefully the next Community Day is not too far away.


CodePipeline & CodeBuild & S3 website

Single Page Applications (SPAs) are convenient as they provide a smooth user experience, and for SPAs React is a good choice. But this post is not about React; it is about AWS services that can be used for automatic deployment of an SPA. The AWS services discussed are CodePipeline, CodeBuild, CodeCommit, and S3.


S3 is an appealing service not only from a storage perspective but also because it can be configured to work as a static website, combining low price and high scalability. An S3 website is also a ‘Serverless’ approach. The lack of an IP address can be handled with Route 53 (using the S3 website as an alias). But let’s move on and say the name of the static-website bucket is
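Switching a bucket into website mode is a single AWS CLI call (the bucket name below is hypothetical; for an SPA it is common to point the error document at index.html as well, so client-side routing can handle unknown paths):

```shell
# Enable static website hosting on a (hypothetical) bucket.
# index.html doubles as the error document so the SPA router
# can handle paths that do not map to real objects.
aws s3 website s3://my-spa-bucket/ \
  --index-document index.html \
  --error-document index.html
```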


CodePipeline can be used as the main framework for continuous delivery. A pipeline can include several stages, and each stage can be one of a handful of types. A basic pipeline contains just the source of the code (CodeCommit) and a build stage (CodeBuild); there is no need to set up a deployment stage at all. CodePipeline stores all output artifacts of the stages in an S3 bucket, and if those artifacts are used, for example, by a Lambda in another stage of the pipeline, an IAM policy with suitable permissions to the CodePipeline bucket should be attached to the Lambda’s service role.
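A pipeline of this shape can be sketched as a CloudFormation resource (a sketch only — the role ARN, account ID, bucket, repository, and project names are all placeholders):

```yaml
# CloudFormation sketch of a minimal source + build pipeline
# (role ARN, bucket, repo, and project names are hypothetical)
Pipeline:
  Type: AWS::CodePipeline::Pipeline
  Properties:
    RoleArn: arn:aws:iam::123456789012:role/CodePipelineServiceRole
    ArtifactStore:
      Type: S3
      Location: my-codepipeline-artifact-bucket   # stores stage output artifacts
    Stages:
      - Name: Source
        Actions:
          - Name: Source
            ActionTypeId:
              Category: Source
              Owner: AWS
              Provider: CodeCommit
              Version: '1'
            Configuration:
              RepositoryName: my-spa-repo
              BranchName: master
            OutputArtifacts:
              - Name: SourceOutput
      - Name: Build
        Actions:
          - Name: Build
            ActionTypeId:
              Category: Build
              Owner: AWS
              Provider: CodeBuild
              Version: '1'
            Configuration:
              ProjectName: my-spa-build
            InputArtifacts:
              - Name: SourceOutput
```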


CodeCommit’s Git integration makes it a convenient starting point for the pipeline. Once Git credentials for an IAM user are set up, the user is able to connect to CodeCommit. CLI usage of CodeCommit is no different from any other Git remote; the user experience is exactly the same. Setting up CodeCommit as the source for CodePipeline is presented in figure 1.
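Assuming the AWS CLI is configured for the IAM user, the credential helper makes CodeCommit behave exactly like any other Git remote (the region and repository name below are hypothetical):

```shell
# One-time setup: let the AWS CLI supply Git credentials for CodeCommit
git config --global credential.helper '!aws codecommit credential-helper $@'
git config --global credential.UseHttpPath true

# From here on it is plain Git (hypothetical region and repo name)
git clone https://git-codecommit.eu-west-1.amazonaws.com/v1/repos/my-spa-repo
cd my-spa-repo
git push origin master
```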

Figure 1. Setting up CodePipeline


A CodeBuild project can be set up through the CodeBuild console, and it is then possible to select the existing project in the CodePipeline console. However, creating the CodeBuild project through the CodePipeline console avoids some issues related to permissions and odd errors: CodeBuild will be invoked by CodePipeline, not CodeCommit. Once the CodeBuild project has been created through the CodePipeline console, the source is correct (the project shows Current source: AWS CodePipeline). Setting up CodeBuild through CodePipeline is presented in figure 2.

Figure 2. Creation of CodeBuild project through CodePipeline console

The heart of a CodeBuild project is the buildspec.yml file. The build process is divided into several phases that can run custom commands and scripts (see the example buildspec.yml below). Correct YAML syntax is crucial, and syntax errors are not necessarily easy to identify from the logs, so make sure all the indentation is correct! As shown below (post_build phase), the build files are copied to the bucket; an IAM policy with sufficient permissions to access that bucket should be attached to the CodeBuild service role.

version: 0.2

phases:
  install:
    commands:
      - npm install
  build:
    commands:
      - npm run build
  post_build:
    commands:
      - aws s3 sync --delete build/ s3:// --acl public-read

Also notice that the version number (0.2) is not a random number; it is a fixed buildspec version defined by AWS.