Deployment Management with Bitbucket Pipeline and AWS CodeDeploy

Erhan Aşıkoğlu
Jul 22, 2019 · 13 min read


Bitbucket Pipelines + S3 + CodeDeploy integration (overview diagram)

Today, even though ideas and methods such as Continuous Integration (CI) and Continuous Delivery (CD), containers, and serverless architectures are everywhere, I’m pretty sure that many organizations still update their environments manually, whether that’s a test environment or, quite often, production.

In this article, I’ll briefly discuss the Continuous Delivery and Continuous Deployment processes using the Bitbucket Pipelines service. After that, we will deploy to EC2 with the AWS CodeDeploy service and complete the Continuous Deployment process. First of all, let’s briefly talk about this Continuous mindset and the differences between its flavors.

Continuous Integration (CI)

CI is the process that a development change or fix should go through before being merged into the main project. Merging the relevant code into the master branch involves steps such as compiling the code and running various tests (unit, integration, etc.) on it. The benefit of this process is that the main branch stays ready, stable, and tested at any time, thanks to the automated tests that run after each change to the repository. Keep in mind that developers should prepare and include the relevant tests for each change.

Continuous Delivery (CD)

In a nutshell, it is the process of delivering the packages prepared during CI to the customer in the fastest way. On top of the automated tests we get from the CI process, it adds automated release steps. It is the process where we have packages ready to be deployed to different environments (such as Development, Test, or Production) with a single click. Thanks to this process, you can prepare release packages much more frequently and quickly, and introduce them to the customer in a more practical way.

CI/CD

Together, CI and CD are the basic software development practices that automate the step-by-step work of compiling, packaging, and publishing, guarantee it with automated tests, and let you release to the required environments with just one click.

Continuous Deployment

The Continuous Deployment process, which takes the CI/CD processes one step further, aims to get the code to the customer in the live environment without any human intervention: every build that survives the compilation, packaging, and automated testing stages is deployed automatically. In this article, we will perform this process through AWS CodeDeploy.

CI/CD & Continuous Deployment

How does Bitbucket get involved?

Now that we know the processes, where does Bitbucket fit in? Bitbucket is a well-known code-hosting product: it provides an easy-to-use interface on top of version control systems such as Git and lets a team work on the code in an interactive way. Besides version control, a Pipelines service is available as well.

Pipeline Service

It is a CI/CD service integrated into Bitbucket. A configuration file stored in your repository allows you to automatically build, test, and even deploy your project to different environments (such as AWS EC2, Azure, or DigitalOcean).

AWS CodeDeploy

It is one of the AWS developer services; it automatically deploys your application packages to AWS EC2 instances, serverless AWS Lambda functions, or your own servers.

After this brief information about the services we plan to use and the flow we are trying to build, it is time to move on to the technical part.

Creating a Pipeline on BitBucket

Pipelines Service in BitBucket

Before diving into details, you can have a look at the configuration files and script examples for reference on GitHub.

Assuming that you already have at least one repository on Bitbucket: when we select the Pipelines option on the repository, Bitbucket provides ready-made configuration files in YAML format for different programming languages so that we can create the pipeline. In this article we will create the pipeline steps of a simple Node.js application.

Basic template for bitbucket-pipelines.yml (Node.js)
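
A minimal sketch of such a template (the image tag and scripts may differ from what Bitbucket suggests for your project):

# bitbucket-pipelines.yml — minimal Node.js template (sketch)
image: node:10.15.3

pipelines:
  default:
    - step:
        caches:
          - node
        script:
          - npm install
          - npm test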

You can create and commit the file through Bitbucket’s UI, or create it manually under the root folder of your project and include it in your next push.

Let’s have a quick look at the file format:

image: the Docker image(s) the pipeline runs in. We can use a single image for the whole pipeline, or define different images per step depending on what that step needs. Bitbucket spins each step up as a container from the specified image, ready to use.

default: the section executed for every push to the repository; it cannot be triggered manually. Besides default, definition blocks can be created on a per-branch basis.

step: the basic building block of the pipeline. Each step clones the repository we are working on, starts the Docker image specified in default or in the step itself, and runs the defined scripts.

caches: caches the dependencies downloaded by tools such as npm or Maven on the Bitbucket servers, so they can be reused by subsequent runs.

script: the part of a step where we list the commands, such as build, test, and deploy commands, to run inside the Docker image.

For further information, check the keywords section on the Atlassian website.

The flow we want to see on the pipeline is as follows: install, build, test, package, zip, upload to S3, and trigger CodeDeploy, as shown in the first diagram of this post. The bitbucket-pipelines.yml file prepared according to this flow can be seen below:
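
A reconstruction of that file under this demo’s assumptions (the Grunt task name and the $S3_BUCKET variable are illustrative; AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_DEFAULT_REGION are expected as repository variables):

image: node:10.15.3

pipelines:
  default:
    # Step 1: install dependencies and run the automated tests
    - step:
        name: Build & Test
        caches:
          - node
        script:
          - npm install
          - npm test
    # Step 2: minify js/css, replace constants, compress the result
    - step:
        name: Package
        script:
          - npm install -g grunt-cli
          - grunt build
          - zip -r application.zip . -x "node_modules/*"
        artifacts:
          - application.zip
    # Step 3: push the package to S3, using a different image
    - step:
        name: Upload to S3
        image: atlassian/pipelines-awscli
        script:
          - aws s3 cp application.zip "s3://$S3_BUCKET/application.zip"
    # Step 4: manually triggered deployment via CodeDeploy
    - step:
        name: Deploy to AWS
        trigger: manual
        image: atlassian/pipelines-awscli
        script:
          - >-
            aws deploy create-deployment
            --application-name "$APPLICATION_NAME"
            --deployment-group-name "$DEPLOYMENT_GROUP_NAME"
            --s3-location "bucket=$S3_BUCKET,key=application.zip,bundleType=zip"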

Our pipeline consists of four steps:

  • Cloning the code from the repository, installing the relevant dependencies, and running the tests. The first step is always automatic; the trigger: manual option that you will see in step 4 is invalid for the initial step.
  • Downloading the Grunt packages and executing the related tasks: minifying the JS and CSS, replacing constants, and creating the zip that will be uploaded to S3. Since all the operations up to this stage happen in a temporary directory on the Bitbucket build machine, we store the compressed files in the same folder.
  • Transferring the compressed project to AWS S3 using the upload commands. As you can see in the code, a different Docker image is used for this step: image: atlassian/pipelines-awscli.
  • When the upload is completed, the last step invokes CodeDeploy to transfer the build package from S3 to the relevant EC2 instance(s) and to run/validate the related bash scripts before and after installation.

trigger: manual

This option in the last step allows us to trigger the corresponding pipeline step manually. This way, after the build has finished and the package is ready, sending the changes to the environment stays under our control. With this feature we have Continuous Integration and Continuous Delivery; by automating this step for a given environment (e.g., test environments), we can turn the process into Continuous Deployment.

As you may have noticed, the pipeline steps we sketched above run sequentially: each step waits for the previous one to complete before it starts. Since the Bitbucket Pipelines service is billed by build minutes, you can use the parallel keyword, which runs long-running steps that do not depend on each other concurrently:

Pipeline parallel work
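
A minimal sketch of the parallel syntax (step names and scripts are illustrative):

pipelines:
  default:
    - step:
        name: Build
        script:
          - npm install
    # These two steps run concurrently
    - parallel:
        - step:
            name: Unit tests
            script:
              - npm run test:unit
        - step:
            name: Lint
            script:
              - npm run lint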

Environment Variables

Another pipeline feature is the ability to define environment variables (repository variables) on the Bitbucket machines where our commands run. We can define these variables per repository. This way, values that must be kept secret can be defined as repository variables in the Bitbucket settings instead of being hard-coded in the YML file. You can also restrict access to these definitions through Bitbucket user permissions. Defining such environment variables also improves the readability of the configuration.

Bitbucket Pipelines Environment Variables settings page

For example, by defining variables such as AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY under Bitbucket → Settings → Pipelines → Repository variables, you can store the credentials of your AWS account. The rest of the variables will be added after the AWS CodeDeploy configuration is done.

Integrate your Pipeline with AWS CodeDeploy

The build and automated-test steps on Bitbucket are done, as mentioned before; now it’s time to install our project on the relevant EC2 instance using the AWS CodeDeploy service. With the previous section, we completed steps 1 and 2 of the diagram at the beginning of this post.

Let’s explain the purpose of using CodeDeploy in this integration:

  • AWS CodeDeploy is a built-in service that works with AWS’s own pipeline services (CodeCommit, CodeBuild, CodePipeline) and can also be triggered externally by other pipeline services such as Bitbucket.
  • Thanks to CodeDeploy, we are able to deploy our previously prepared packages from S3 to the related EC2, Lambda, or ECS platform services.
  • The CodeDeploy service needs an application specification file named appspec.yml. This file should be located in the project root folder, just like the bitbucket-pipelines.yml file. In the appspec file, we specify the commands we want the CodeDeploy service to run before and after installation, on application start and stop, or in error situations.
  • When using CodeDeploy, we first need to define an “Application” object and a “DeploymentGroup” under it. These steps are explained in detail below.

Appspec File Structure

This is the application specification file that lets us run a number of operations before and after installation on the environments where the application will be installed, and manage the application’s entire life cycle afterwards. With hook definitions that vary by platform (EC2, ECS, Lambda), you can manage the actions to take before starting the application (e.g., installing dependencies, building the code), on start (e.g., start parameters), or on stop. It is, in a way, the interface for managing your application’s life cycle in its environment.

More information about AppSpec is available in the AWS documentation. Be sure to also look at the section on hooks, which drive the application lifecycle.

The appspec.yml file that we will use in our application is as follows:

version: 0.0
os: linux
files:
  - source: ./
    destination: /home/ec2-user/my-app
hooks:
  BeforeInstall:
    - location: scripts/before_install.sh
      timeout: 300
      runas: root
  AfterInstall:
    - location: scripts/after_install.sh
      timeout: 300
      runas: root
  ApplicationStop:
    - location: scripts/stop_server.sh
      timeout: 120
      runas: root
  ApplicationStart:
    - location: scripts/start_server.sh
      timeout: 120
      runas: root
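
The hook scripts themselves are plain shell. As an illustration only (the sample repository’s scripts may differ; server.js and the nohup approach are assumptions), scripts/start_server.sh for a Node.js app could look like:

#!/bin/bash
# scripts/start_server.sh — hypothetical example
cd /home/ec2-user/my-app

# Start the app detached; redirecting output keeps the hook from
# hanging while waiting on the child process's stdout/stderr
nohup node server.js > /home/ec2-user/my-app/app.log 2>&1 &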

You can find the example project, with the file structure and script examples, on GitHub.

The order of configuration/setup on the AWS side is as follows:

  • Creating user account for Bitbucket Client.
  • Creating a new bucket for storing transferred code in S3.
  • Generating a role for CodeDeploy service.
  • Generating a role for EC2 instances and assigning it to them.
  • Creating and configuring Application & Deployment Group under CodeDeploy service.

1- Creating User Account

Note that the Bitbucket Pipelines service acts as a client of the AWS platform. For this reason, an external service must have the relevant credentials before it can access resources within AWS. First, we create a user with programmatic access in AWS IAM so the Bitbucket Pipelines service can access AWS with the following permissions:

  • AmazonS3FullAccess → to be able to upload build & tested code to S3.
  • AWSCodeDeployFullAccess → to be able to trigger CodeDeploy from pipeline steps ( last step in our example).

You can either create these permissions under a group first and then assign that group to the user, or add the permissions directly while creating the user.

BitBucket user generation on AWS IAM.
BitBucket Pipeline user with permissions.

Since we need the ACCESS and SECRET keys of the user we create (BitBucketPipelineUser), the user must have programmatic access, and the credentials should be stored when the creation process is completed. (We will then use this information in the Bitbucket Environment Variables section described above.)

Please add the credentials of the user to the Bitbucket repository variables as AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
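
If you prefer the AWS CLI over the console, a hypothetical equivalent of the steps above (run with an admin profile) would be:

# Create the pipeline user and attach the two managed policies
aws iam create-user --user-name BitBucketPipelineUser
aws iam attach-user-policy --user-name BitBucketPipelineUser \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam attach-user-policy --user-name BitBucketPipelineUser \
  --policy-arn arn:aws:iam::aws:policy/AWSCodeDeployFullAccess

# Prints the AccessKeyId/SecretAccessKey pair to store in Bitbucket
aws iam create-access-key --user-name BitBucketPipelineUser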

2- Creating S3 Bucket

Next, we need the S3 bucket where the Bitbucket Pipeline will send our codebase. When creating it, pay attention that the instances and the S3 bucket should be in the same region. In this demo, the bucket bitbucket-pipeline-codedeploy-demo is created in the eu-west-1 region.
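
With the CLI, this is a one-liner:

# Create the demo bucket in the same region as the EC2 instances
aws s3 mb s3://bitbucket-pipeline-codedeploy-demo --region eu-west-1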

3- Defining Role For CodeDeploy

As mentioned before, to use CodeDeploy we need to create an Application and a DeploymentGroup under it, and while configuring the DeploymentGroup we must indicate a service role that grants AWS CodeDeploy access to EC2, Lambda, or whatever platform you are deploying to. For CodeDeploy to manage our EC2 instances, we create a role (CodeDeployRoleForDemo) with the AWSCodeDeployRole permission, as below:

Creating AWS CodeDeploy Role
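
A hypothetical CLI version of the same role creation (the trust document mirrors what the console’s CodeDeploy role wizard produces):

# Allow the CodeDeploy service to assume this role
cat > trust-codedeploy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "codedeploy.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF

aws iam create-role --role-name CodeDeployRoleForDemo \
  --assume-role-policy-document file://trust-codedeploy.json

# AWSCodeDeployRole is the AWS-managed policy mentioned above
aws iam attach-role-policy --role-name CodeDeployRoleForDemo \
  --policy-arn arn:aws:iam::aws:policy/service-role/AWSCodeDeployRole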

4- Role Definition for EC2 instances

Permissions need to be granted for the EC2 machines to access the S3 service. We can define permissions in two ways: select from the predefined list of permissions provided by AWS, or write our own policy. I prefer creating a new policy so that the role has only the required actions on the required resources (policy/BitBucketCodeDeployPolicyDemo). After creating a new policy in IAM, add the following JSON:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor1",
      "Effect": "Allow",
      "Action": [
        "s3:Get*",
        "s3:List*"
      ],
      "Resource": [
        "arn:aws:s3:::bitbucket-pipeline-codedeploy-demo/*"
      ]
    }
  ]
}

After creating the policy, we define a new role and assign the policy to it. Since we want the EC2 service to access the S3 service, we select EC2 as the service that will use this role on the role screen, and attach the previously created policy to the role.

Attach policy to role.

We need to modify one more property of the role before assigning it to our instances: the trusted entities. We should add the CodeDeploy service, with its region information (in our case eu-west-1), to the trusted entities, as below.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": [
          "codedeploy.eu-west-1.amazonaws.com",
          "ec2.amazonaws.com"
        ]
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
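
As a CLI sketch (the role name BitBucketCodeDeployRoleDemo and <account-id> are placeholders; trust-ec2.json contains the trust policy above):

# Create the EC2 role with the trust policy above
aws iam create-role --role-name BitBucketCodeDeployRoleDemo \
  --assume-role-policy-document file://trust-ec2.json

# Attach the custom S3 read policy created earlier
aws iam attach-role-policy --role-name BitBucketCodeDeployRoleDemo \
  --policy-arn arn:aws:iam::<account-id>:policy/BitBucketCodeDeployPolicyDemo

# EC2 attaches roles to instances through instance profiles
aws iam create-instance-profile --instance-profile-name BitBucketCodeDeployRoleDemo
aws iam add-role-to-instance-profile \
  --instance-profile-name BitBucketCodeDeployRoleDemo \
  --role-name BitBucketCodeDeployRoleDemo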

5- CodeDeploy — Application & Deployment Group Creation

The first time we open CodeDeploy, it asks us to create an Application. The only choice to make is the compute platform the application will target. In our scenario we chose EC2.
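
Via the CLI this corresponds to (the application name is illustrative; Server is the compute-platform value for EC2/on-premises):

aws deploy create-application \
  --application-name BitBucketCodeDeployDemo \
  --compute-platform Server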

After creating the Application, we create the DeploymentGroup, where we define the EC2 instance relationships and monitor deployments and events.

We start by assigning the role (CodeDeployRoleForDemo) we created before. Then we need to select the deployment type. There are two types of deployment: in-place and blue/green.

  • In-place → deploys the latest revision of the build to the instances currently in the deployment group; during a deployment we may experience interruptions on each instance.
  • Blue/green → instead of updating the current instances, it replaces the instances in the deployment group with new ones and deploys the latest revision to them.

After selecting our deployment type (in-place), we add our EC2 instances or auto-scaling group to the DeploymentGroup. Then we choose a deployment configuration from the predefined list, or create our own custom configuration. You can find the differences between the predefined configurations in the table below; for more information please visit the AWS documentation.

Predefined deployment configurations
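
For reference, a hypothetical CLI equivalent of the console flow (names and the tag filter are illustrative):

aws deploy create-deployment-group \
  --application-name BitBucketCodeDeployDemo \
  --deployment-group-name DemoDeploymentGroup \
  --service-role-arn arn:aws:iam::<account-id>:role/CodeDeployRoleForDemo \
  --ec2-tag-filters Key=Name,Value=my-app-instance,Type=KEY_AND_VALUE \
  --deployment-config-name CodeDeployDefault.OneAtATime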

It is highly recommended to select the related load balancer during deployment group configuration, so that incoming traffic can be managed during the deployment. This option can be found as the last step of the DeploymentGroup configuration.

Deployment Group after configuration

Once we are done with the configuration of the Application & DeploymentGroup, we need to update our Bitbucket Pipelines repository variables with APPLICATION_NAME & DEPLOYMENT_GROUP_NAME, as described in the Environment Variables section of the Bitbucket Pipelines service above.

6- Final Step, Configuring Instances

With our DeploymentGroup created and configured, we are ready to launch our new EC2 instances (or modify existing ones) in two steps:

  • Attach the EC2 role we created at point 4 for the CodeDeploy & S3 interactions. [Note: if your currently running instances already have predefined roles (e.g., profiles under ~/.aws/config), you should detach/remove them and merge them into the role created at point 4.]
  • Install the CodeDeploy agent (a quick installation sketch follows below; please also visit the related AWS documentation).
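
A quick installation sketch for Amazon Linux, following the AWS documentation (replace eu-west-1 with your region):

sudo yum update -y
sudo yum install -y ruby wget
cd /home/ec2-user
# Region-specific CodeDeploy resource bucket
wget https://aws-codedeploy-eu-west-1.s3.eu-west-1.amazonaws.com/latest/install
chmod +x ./install
sudo ./install auto
# Verify the agent is running
sudo service codedeploy-agent status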

After completing the AWS configuration, make some changes in your local project (or just use the sample project on GitHub), commit & push it to Bitbucket, then go to the Bitbucket Pipelines screen and monitor:

  • The steps that run automatically in your pipeline; check the pipeline console and the time each step consumes.
  • Your bucket, to verify that the compressed version of your project has been uploaded.
  • After the automatic steps are completed, run the Deploy to AWS step and move to the CodeDeploy DeploymentGroup to watch your deployment status.

Conclusion

Congratulations: with a single commit, you now have a pipeline

  • that builds your project,
  • runs your automated tests,
  • manages your dependencies,
  • and deploys automatically onto your instances

without any human interaction (only the deploy step has a single “run” button; for a test environment you can omit that and turn the flow into Continuous Deployment), thanks to Bitbucket Pipelines & AWS CodeDeploy.

Thanks for reading! Hope you liked it.
