Demo Server

At midnight every day, any updates to the main branch are automatically deployed to http://demo.fcrepo.org/fcrepo. Below is an overview of the steps that make this happen.

  1. On commits to main, a new Docker image is created and pushed to Docker Hub (fcrepo/fcrepo).
  2. Every morning at 12:00am Eastern, a scheduled AWS CloudWatch event fires and triggers a simple AWS Lambda function.
  3. The Lambda function invokes a rebuild operation on the AWS Elastic Beanstalk environment (FedoraDemo-env) that hosts our demo instance.

Note that any content uploaded to the demo instance is wiped clean nightly.

If you're curious about the details of how this works - perhaps because you want to change the existing demo setup, or to do something similar in your own environment - here's how to set it up.

Building the docker image

The Docker image is created and pushed using this script, which is triggered by the Travis build. You'll notice that we use git to clone the fcrepo-docker project, which contains the build.sh and push.sh scripts.
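In outline, the build-and-push step boils down to cloning fcrepo-docker, building the image, and pushing it to Docker Hub. Here is a rough sketch of those invocations in Python - the repository URL and tag below are illustrative, and the real logic lives in the build.sh and push.sh scripts:

```python
import subprocess

# Tag matching the Dockerrun.aws.json below; adjust for your own setup.
IMAGE = "fcrepo/fcrepo:6.0.0-SNAPSHOT"


def docker_commands(image):
    """Return the commands that the build/push scripts roughly perform."""
    return [
        # Repository URL is illustrative; use the fcrepo-docker project referenced above.
        ["git", "clone", "https://github.com/fcrepo-exts/fcrepo-docker.git"],
        ["docker", "build", "-t", image, "fcrepo-docker"],
        ["docker", "push", image],
    ]


def run(commands, dry_run=True):
    for cmd in commands:
        if dry_run:
            print(" ".join(cmd))  # show what would run, without running it
        else:
            subprocess.run(cmd, check=True)


run(docker_commands(IMAGE))
```

With `dry_run=True` the sketch only prints the commands, so you can inspect them before letting it touch Docker.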

Setting up Elastic Beanstalk

Create a new Elastic Beanstalk application and environment using the Docker platform, accepting all the default settings. When you're prompted to upload your application, use this zip file: 

Inside you'll find a single file: 

Dockerrun.aws.json

{
  "AWSEBDockerrunVersion": "1",
  "Logging": "/var/log",
  "Image": {
    "Name": "fcrepo/fcrepo:6.0.0-SNAPSHOT",
    "Update": "true"
  },
  "Ports": [
    {
      "ContainerPort": 8080,
      "HostPort": 80
    }
  ]
}

Notice the "Update" field: it tells Elastic Beanstalk to pull the latest image from Docker Hub on each rebuild.
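Before uploading the zip, it can be worth a quick sanity check that the file parses and maps the ports you expect. A small sketch, with the Dockerrun.aws.json contents from above embedded inline:

```python
import json

# The Dockerrun.aws.json contents from above, embedded for a quick check.
dockerrun = json.loads("""
{
  "AWSEBDockerrunVersion": "1",
  "Logging": "/var/log",
  "Image": {
    "Name": "fcrepo/fcrepo:6.0.0-SNAPSHOT",
    "Update": "true"
  },
  "Ports": [
    {
      "ContainerPort": 8080,
      "HostPort": 80
    }
  ]
}
""")

mapping = dockerrun["Ports"][0]
# Confirm the container's port 8080 is exposed on host port 80.
print(f'{dockerrun["Image"]["Name"]}: '
      f'container {mapping["ContainerPort"]} -> host {mapping["HostPort"]}')
```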

Create Your Lambda

The Lambda configuration is similarly straightforward. Go to AWS Lambda in the console and do the following:

  1. Create a function from scratch.
  2. Select Python 3.8 as your runtime.
  3. Create the function by copying in the following code snippet, then save it.

    import json
    import boto3

    client = boto3.client("elasticbeanstalk")


    def lambda_handler(event, context):
        print("Received event: " + json.dumps(event, indent=2))

        environment_name = "FedoraDemo-env"
        try:
            response = client.rebuild_environment(
                EnvironmentName=environment_name,
            )
            status = response["ResponseMetadata"]["HTTPStatusCode"]
            print(f"Status Code = {status}")
            return status
        except Exception as e:
            print(e)
            print(f"Error rebuilding environment: {environment_name}")
            raise e

  4. Give your function AWSElasticBeanstalkFullAccess permissions, along with permission to execute the Lambda (which should happen by default).
  5. Note that environment_name is hard-coded here. This value can be whatever you want, as long as it matches your Elastic Beanstalk environment name.
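If you want to exercise the handler logic locally before wiring anything up in AWS, one option is to swap the boto3 client for a small stub. A minimal sketch - StubBeanstalkClient is a hypothetical test double, not part of AWS:

```python
import json


class StubBeanstalkClient:
    """Hypothetical test double mimicking the one call the handler makes."""

    def rebuild_environment(self, EnvironmentName):
        print(f"rebuild requested for {EnvironmentName}")
        return {"ResponseMetadata": {"HTTPStatusCode": 200}}


# Stand-in for boto3.client("elasticbeanstalk") during a local dry run.
client = StubBeanstalkClient()


def lambda_handler(event, context):
    print("Received event: " + json.dumps(event))
    environment_name = "FedoraDemo-env"
    response = client.rebuild_environment(EnvironmentName=environment_name)
    status = response["ResponseMetadata"]["HTTPStatusCode"]
    print(f"Status Code = {status}")
    return status


print(lambda_handler({}, None))  # → 200
```

Since the handler only ever calls rebuild_environment, a one-method stub is enough to check the control flow end to end.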

Tie It Together with CloudWatch Events

In order to trigger your rebuild, go to AWS CloudWatch > Events > Rules.

  1. Create a rule
  2. Select Schedule
  3. Select Cron expression and enter a cron expression. For every day at midnight (UTC) use 

    0 0 * * ? *

  4. Click Target, select Lambda, and then your function.
  5. Save it and you're done.
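One thing to watch: CloudWatch scheduled rules evaluate cron expressions in UTC, so `0 0 * * ? *` fires at midnight UTC rather than midnight Eastern. A quick stdlib check of what midnight Eastern looks like in UTC:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Midnight Eastern on a winter date (EST, UTC-5).
local = datetime(2024, 1, 15, 0, 0, tzinfo=ZoneInfo("America/New_York"))
utc = local.astimezone(ZoneInfo("UTC"))
print(utc.strftime("%H:%M"))  # → 05:00
```

So if the rebuild must fire at midnight Eastern, the UTC cron hour would be 5 in winter (or 4 during daylight saving time); a fixed cron expression can only match one of the two.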

