If you are currently struggling on the sysadmin side to keep all your servers up to date, it may be time to take some of the weight off your shoulders and automate your release process with a continuous delivery pipeline.
What is CI/CD?
Continuous integration / continuous delivery (CI/CD) is the practice of making frequent (often daily) code updates, building and testing new releases, and rolling the changes out to your production servers quickly and efficiently.
It's a very broad concept that captures the core of the DevOps culture – streamlining the flow of new code from developers' brains onto your servers. CI/CD is usually implemented with a toolchain called a pipeline, a set of tools that automate the entire process from source to deployment.
This is what AWS provides with its CodeSuite tools, and it's in a particularly good position to implement such a pipeline: you'll usually be running your production servers on EC2 anyway, which makes the deployment step much easier and well integrated.
AWS's CodeSuite tools
CodeSuite consists of a few different tools. It starts with CodeCommit, AWS's managed source control service. It's a little clunkier and less feature-rich than the competition, but it's easy enough to set up Git with multiple remotes that you may as well use it if you plan to use the rest of the pipeline. CodeCommit has a very generous free tier, so it probably won't cost you anything.
Next comes CodeBuild, which takes your source from CodeCommit (or GitHub/Bitbucket), builds it, and runs any tests you provide along the way. The build runs on an EC2 server, which you pay for only while the build is running. Complex projects may need a powerful machine for fast builds.
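The build and test steps are driven by a buildspec.yml file in the root of your repository. As a minimal sketch for a hypothetical Node.js project (the npm commands and runtime version are placeholders for whatever your own build and test steps are), it might look like:

```yaml
version: 0.2
phases:
  install:
    runtime-versions:
      nodejs: 18        # placeholder: pick your project's runtime
  build:
    commands:
      - npm ci          # placeholder: install dependencies
      - npm run build   # placeholder: your build step
      - npm test        # if tests fail here, the pipeline stops
artifacts:
  files:
    - '**/*'            # package everything for the deploy stage
```

The artifacts section matters for the pipeline: whatever it captures is what gets handed to the deployment stage.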
Once the build completes, your application is ready to deploy. This step is handled by CodeDeploy; you create a "deployment group" that can contain any number of EC2 instances or entire Auto Scaling groups. This is where AWS's pipeline really shines.
With CodeDeploy you can fine-tune how the deployment is handled – there are presets for all at once, half the group at a time, 10% every few minutes, and many more, all designed to minimize downtime from unexpected bugs in production. Having all your servers update automatically is nice enough, but CodeDeploy can even hook into your load balancer and drain traffic from instances while they're updating. Combined with a staggered rollout strategy that ensures a minimum number of healthy hosts, this makes production updates stress-free.
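On each server, the deployment is described by an appspec.yml file that ships with your build artifact. A rough sketch for an EC2 deployment might look like the following (the destination path and the scripts under scripts/ are hypothetical names for illustration):

```yaml
version: 0.0
os: linux
files:
  - source: /                   # copy the whole build artifact...
    destination: /var/www/app   # ...to this path on each instance (placeholder)
hooks:
  ApplicationStop:
    - location: scripts/stop_server.sh    # hypothetical script in your repo
      timeout: 60
  AfterInstall:
    - location: scripts/install_deps.sh
      timeout: 300
  ApplicationStart:
    - location: scripts/start_server.sh
      timeout: 60
  ValidateService:
    - location: scripts/health_check.sh   # a non-zero exit here can trigger a rollback
      timeout: 120
```

The hooks run in a fixed lifecycle order on each instance, which is how CodeDeploy can stop, update, restart, and health-check servers one batch at a time.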
All of this is wrapped up in a single pipeline, which watches your source control and runs automatically whenever you push changes to the release branch, building, testing, and deploying your code to all your servers.
Setting up a pipeline
First, you need to get your code into CodeCommit. We recommend setting up CodeCommit as a separate release remote alongside your primary source control. If you use GitHub or Bitbucket, you can connect your repository directly instead, but CodeCommit is an all-AWS solution and lets you manage organizational access to server updates through the IAM console.
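The two-remote layout is just ordinary Git. A quick sketch (both repository URLs are placeholders for your own; the throwaway repo exists only so the commands run standalone):

```shell
# Create a throwaway repo so this example is self-contained.
cd "$(mktemp -d)"
git init -q app && cd app

# Keep your existing primary remote (e.g. GitHub)...
git remote add origin https://github.com/example/app.git
# ...and add CodeCommit as a second, release-only remote.
git remote add codecommit https://git-codecommit.us-east-1.amazonaws.com/v1/repos/app

git remote -v
```

With this layout, day-to-day pushes go to origin as usual, and cutting a release is just a push to the codecommit remote – that push is what will trigger the pipeline.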
Then go to the CodePipeline console to get started. Click “Create Pipeline” and give it a name and description.
Each stage in the pipeline needs some configuration. The first is the source stage, which can connect to CodeCommit, GitHub, or Bitbucket. The latter two require you to link your accounts through OAuth, but CodeCommit connects directly. Select the repository you're using and the branch for releases. If you use CodeCommit as a secondary remote, you'll probably choose master here, but if you use a third-party provider, you may want to create a separate release branch.
Under the branch options, you'll find the settings for when the pipeline runs automatically. By default, it runs every time a new commit is pushed to the branch you specified. You can change this, but it's probably what you want.
Next up is the build stage. CodePipeline supports Jenkins and the built-in CodeBuild for building code. If you're already using Jenkins, you'll need to install the CodePipeline plugin to connect it to AWS. Otherwise, you can set up CodeBuild by clicking "Create Project" to open a dialog.
CodeBuild has a whole bunch of things to configure, so you can read our guide to setting it up to learn more. When you're done, the dialog closes and returns you to the CodePipeline setup.
The next stage is deployment. CodePipeline supports a few different deployment options; notably, if you use AWS CloudFormation or Elastic Container Service, you can deploy updates directly to them. For general EC2 and Lambda deployments, you'll use CodeDeploy.
CodeDeploy also requires a fair amount of configuration, so you can read our full guide to setting it up. In short, you create a deployment group made up of your EC2 servers, an Auto Scaling group, or Lambda functions, and select a deployment configuration – all at once, half at a time, and so on. CodeDeploy handles everything else, adjusting your load balancer so traffic isn't routed to servers that are mid-update and always keeping a minimum number of healthy hosts, so your service never goes down for an update.
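If you prefer scripting this instead of clicking through the console, a deployment group can also be created with the AWS CLI. A hedged sketch – the application name, group name, tag filter, account ID, and role name are all placeholders, and it assumes the CodeDeploy agent is already installed on the tagged instances:

```shell
# All names and ARNs below are placeholders for your own resources.
aws deploy create-deployment-group \
  --application-name my-app \
  --deployment-group-name my-deploy-group \
  --deployment-config-name CodeDeployDefault.HalfAtATime \
  --ec2-tag-filters Key=Environment,Value=production,Type=KEY_AND_VALUE \
  --service-role-arn arn:aws:iam::123456789012:role/CodeDeployServiceRole
```

CodeDeployDefault.HalfAtATime is one of the built-in deployment configurations mentioned above; CodeDeployDefault.OneAtATime and CodeDeployDefault.AllAtOnce are the other presets for EC2.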
When you're done, you'll return to CodePipeline and select the application you just configured. It should pull in all the settings you made, and you can click Next to review your pipeline before launching it.
Once you've created the pipeline, the first build runs automatically. If there are any errors in the build, the pipeline stops and your servers won't be updated.
You can test the pipeline's change detection by pushing a new commit to your source control. The pipeline should start running automatically and, if all goes well, roll the changes out to your servers.
You can go back and edit the pipeline at any time or adjust the CodeBuild or CodeDeploy configurations. If you encounter build errors, make sure your buildspec file handles everything correctly.