14 Mar 2016

Apps from S3 for Docker enabled AWS Elastic Beanstalk

AWS facilitates Docker container management and deployment extremely well, not only via its dedicated EC2 Container Service (ECS), but also via easy integration options for both OpsWorks and Elastic Beanstalk.

In this post we're going to look at automating the deployment of a PHP website from a secure S3 bucket into a Docker enabled Elastic Beanstalk environment.

Two key modifications are needed to enable the flow. First, the awscli tools must be installed within the container so the app can be copied out of S3 and into the (container) web root. Second, a modified IAM role must be applied to the container host so the awscli tools have permission to do their thing.

Installing the awscli tools into the container via a Dockerfile is a simple process; in the following example we use the (Ubuntu based) apache-php image from Tutum. It is assumed that your app has been pre-zipped and uploaded into S3 (manually, or via a CD pipeline as discussed later):

FROM tutum/apache-php
MAINTAINER Your Name <[email protected]>

# Run apt non-interactively during the build
ENV DEBIAN_FRONTEND noninteractive
# Update and upgrade, then install unzip and the python toolchain in a single layer
RUN apt-get update -y && apt-get upgrade -y && \
    apt-get install -y unzip python python-pip python-dev build-essential
# Install the awscli tools
RUN pip install awscli boto
# Expose required web port
EXPOSE 80
# Pull zipped website application from S3 and install to web root
RUN aws s3 cp s3://your-bucket-name/website.zip /var/www/html/
RUN cd /var/www/html && unzip website.zip && rm website.zip


And, for completeness, the accompanying Dockerrun.aws.json:

{
  "AWSEBDockerrunVersion": "1",
  "Image": {
    "Name": "tutum/apache-php",
    "Update": "true"
  },
  "Ports": [
    {
      "ContainerPort": "80"
    }
  ]
}

As required by Elastic Beanstalk, zip these two files together, upload the archive into S3 and specify it as the application source during environment creation.

Next, the role. 

You might expect the default IAM policies for the aws-elasticbeanstalk-ec2-role to permit copying items from secure S3 buckets; they don't. As such a policy addition is required, and the following will suffice, though ideally you would scope the Resource down to your app bucket rather than all of S3. (Name it what you like; we call ours 'AWSElasticBeanstalkS3App'):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1457953232000",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::*"
            ]
        }
    ]
}

With the policy created, attach it to the aws-elasticbeanstalk-ec2-role alongside the default policies.
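
If you prefer the CLI to the console, the attachment can be sketched as an inline policy on the role (role and policy names as above; the aws call is commented out because it needs IAM admin credentials):

```shell
set -e
# Write the policy above to disk, exactly as shown in the post:
cat > policy.json <<'EOF'
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1457953232000",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::*"
            ]
        }
    ]
}
EOF
python3 -m json.tool policy.json > /dev/null  # sanity-check the JSON
# Attach as an inline policy on the instance role:
# aws iam put-role-policy \
#   --role-name aws-elasticbeanstalk-ec2-role \
#   --policy-name AWSElasticBeanstalkS3App \
#   --policy-document file://policy.json
```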


And that's about it. Now, when the Docker enabled AWS Elastic Beanstalk environment is launched, the container is built dynamically on the role-enabled host from the Dockerfile, with the awscli tools pulling the zipped app from your secure S3 bucket and unpacking it into the (container) web root.
You could use your own pre-baked Docker image with the awscli tools already included, or a more customised Dockerfile with granular Apache and PHP configuration and the tools install tagged on as shown. However, we are long-time Tutum users and would highly recommend their official images for the majority of use cases.

Of course, your Docker container deployment pipeline may differ, perhaps with a localised /src directory containing web files for volume mount/test and a resulting 'docker build', but there are scenarios where a centralised S3 repo is useful; it all depends on your workflow.
For example, we use ThoughtWorks GoCD extensively for continuous delivery, with components for certain apps that compress and upload web files into secure S3 buckets on a Git push. Staged deployment into Elastic Beanstalk is then incorporated into the ongoing pipeline using the awsebcli tools installed on the GoCD server.
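
As a rough sketch of such a pipeline stage (application and environment names are hypothetical, and the eb calls are commented out because they need an initialised awsebcli workspace and credentials):

```shell
set -e
# GoCD exposes a pipeline counter we can reuse as a version label;
# fall back to "manual" when run outside the pipeline.
LABEL="build-${GO_PIPELINE_COUNTER:-manual}"

# One-time workspace setup on the GoCD server (interactive):
#   eb init my-app --region eu-west-1
# Per-run staged deploy of the current bundle:
#   eb deploy my-app-env --label "$LABEL"
echo "would deploy label $LABEL"
```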

If you would like more information on methods detailed in this post, or any other aspects of DevOps and Infrastructure management for the AWS Cloud, please feel free to get in touch via the main website.
