AWS Lambda

Terraform is an infrastructure-as-code tool written in Go for building, changing, and versioning infrastructure safely and efficiently. I use it pretty much every day, and I really love it. It is very simple to learn, and it lets you keep your infrastructure clean and fully automated. I’ve mainly used it with AWS resources, so I can’t say much about other cloud providers, but as far as AWS is concerned, Terraform covers almost everything you need.

AWS Lambda is a compute service that lets you run code without provisioning or managing servers. AWS Lambda executes your code only when needed and scales automatically, and the best thing about it is that you pay only for the compute time you consume.

And obviously, you can manage your Lambda functions with Terraform.


So, what’s the use case here? You know those small scripts you need to run, or scheduled jobs that need to happen? Well, most of the time, I would say that Lambda is a great place to run them, if not the best place.

Now, there are countless ways to manage your Lambda functions (and everything that goes with them: IAM roles, API Gateway config, and so on). For relatively big projects running on Lambda, with a lot of other AWS dependencies, I would recommend the Serverless Framework, as it is very complete and gets a lot of support from the community. It’s a great project.

But sometimes there’s only a super small script that you want to run, and you might not want to start learning that framework, or you might not want to use it for such a small piece of code. That’s the issue I had recently, so I decided to check out what Terraform could do for this.

It turned out to be really useful: I’m now using this approach for several scripts running on Lambda, so I figured I’d share it.

In the end, I just need to run one command to deploy my function.

Use case

The example I’ll use for this post is a super simple Python script that checks if a file exists on S3. If the file is there, the function returns true; if it’s not, it returns false. I also want this script to run once a day, every day at 1am. What the code does is not the important thing here, really. All we care about is how to automate the whole thing.

So, the setup will look like this:

1. A Cloudwatch Event Rule configured to run at 1am every day. The Lambda function is the target of that rule, and the target call has two input parameters: bucket and file_path.

2. The Lambda function, which gets the S3 coordinates of the file from the input and checks if the file exists. The function needs read permissions on all the S3 buckets we want it to check.

As usual, you’ll find all the code I used for this post on my GitHub page, in this repository.

Let’s go


This is what you’ll need:

  • Terraform
  • Python 2.7
  • Zip

Let’s start with an empty folder for this project.

First of all, let’s configure a virtual environment for Python:

virtualenv venv
source venv/bin/activate
pip install boto3
pip freeze > requirements.txt

The Code

Now, here is the python code that I want to run on Lambda:

import boto3
import botocore

# Checks if the file exists
def check_file(bucket, key):
    s3 = boto3.resource('s3')
    file_exists = False
    try:
        s3.Object(bucket, key).load()
    except botocore.exceptions.ClientError as e:
        if e.response['Error']['Code'] == "404":
            file_exists = False
        else:
            raise
    else:
        file_exists = True
    return file_exists

# Main function. Entrypoint for Lambda
def handler(event, context):

    # Get Bucket Name
    bucket_name = event['bucket']

    # Get File Path
    file_path = event['file_path']

    # Check if exists
    if check_file(bucket_name, file_path):
        print "File {} exists.".format(file_path)
        return True
    else:
        print "File {} does not exist.".format(file_path)
        return False

# Manual invocation of the script (only used for testing)
if __name__ == "__main__":
    # Test data
    test = {}
    test["bucket"] = "my_bucket"
    test["file_path"] = "path_to_file"
    # Test function
    handler(test, None)

Let’s call that script check_file_lambda.py.

You might notice the last bit of code in the script. I use it for development purposes, so that I can run the function locally and test it (python check_file_lambda.py).

See something missing in this script? There are no credentials! This is actually normal. Another awesome thing AWS allows is attaching roles and policies to our Lambda function. It means that if our function needs to access other AWS resources, we can simply attach an IAM Role allowing that access. The resulting credentials get automatically injected into Boto3 behind the scenes, which makes it entirely transparent to us. Pretty cool!
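Since the credentials only exist at runtime, you don’t need them to test the branching logic either. Here is a minimal, dependency-free sketch of what check_file does, using a stand-in for botocore’s ClientError (the class and helper names are mine, for illustration only):

```python
# Stand-in for botocore.exceptions.ClientError, which carries the
# error code in e.response['Error']['Code'].
class FakeClientError(Exception):
    def __init__(self, code):
        self.response = {"Error": {"Code": code}}

def check_file(load):
    # load() mimics s3.Object(bucket, key).load(): it succeeds if the
    # object exists and raises a client error otherwise.
    try:
        load()
    except FakeClientError as e:
        if e.response["Error"]["Code"] == "404":
            return False
        raise  # anything other than a 404 is a real error
    return True

def missing():
    raise FakeClientError("404")

print(check_file(lambda: None))  # True: the object loaded fine
print(check_file(missing))       # False: S3 answered 404
```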


Alright, now that we have the code, it’s time to deploy it with Terraform. We’re going to create two files for this:

  • a resources file, which will contain all the Terraform resources
  • a variables file, where we define the variables we want to use

Let’s start with the variables file. Super easy: here it only contains a variable defining the AWS Region where we want to deploy our Lambda function:

variable "aws_region" {
  default = "eu-west-1"
}
Now, I’m going to describe step by step the content of the resources file. You can check out the entire file here.

First of all, we start by defining which provider we want to use (AWS, obviously). We also define the region, by referencing the variable created earlier.

provider "aws" {
    region = "${var.aws_region}"
}
I said that we wanted a function that gets triggered every day at 1am. For that, we are going to use a Cloudwatch Event Rule:

resource "aws_cloudwatch_event_rule" "check-file-event" {
    name = "check-file-event"
    description = "check-file-event"
    schedule_expression = "cron(0 1 ? * * *)"
}
The rule is created, we now need to define a target for that rule. Here, it will be our Lambda function, but it also could be a bunch of other things (SNS, SQS, …).

resource "aws_cloudwatch_event_target" "check-file-event-lambda-target" {
    target_id = "check-file-event-lambda-target"
    rule = "${aws_cloudwatch_event_rule.check-file-event.name}"
    arn = "${aws_lambda_function.check_file_lambda.arn}"
    input = <<EOF
{
  "bucket": "my_bucket",
  "file_path": "path/to/file"
}
EOF
}
As you can see, we are referring to the Lambda function that we are going to define later in the file. Terraform is intelligent enough to know in which order to create all the resources.

We are also defining an input, where we can add parameters. This is where we list the bucket and the file we want to check.
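That input is a raw JSON string: Lambda parses it and hands the result to the handler as the event argument, which is how event['bucket'] and event['file_path'] get their values in the script:

```python
import json

# The JSON document from the target's "input" attribute
raw_input = '{"bucket": "my_bucket", "file_path": "path/to/file"}'

# Lambda deserializes it before calling handler(event, context)
event = json.loads(raw_input)
print(event["bucket"])     # my_bucket
print(event["file_path"])  # path/to/file
```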

Earlier in the post I talked about attaching an IAM Role to the Lambda function. Let’s create that role now:

resource "aws_iam_role" "check_file_lambda" {
    name = "check_file_lambda"
    assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "sts:AssumeRole",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Effect": "Allow",
      "Sid": ""
    }
  ]
}
EOF
}
This is only the base definition of a role for a Lambda function. Now, we are going to create the policy that allows read-only access to S3, and attach it to the role.

data "aws_iam_policy_document" "s3-access-ro" {
    statement {
        actions = [
            "s3:Get*",
            "s3:List*",
        ]
        resources = [
            "*",
        ]
    }
}

resource "aws_iam_policy" "s3-access-ro" {
    name = "s3-access-ro"
    path = "/"
    policy = "${data.aws_iam_policy_document.s3-access-ro.json}"
}

resource "aws_iam_role_policy_attachment" "s3-access-ro" {
    role       = "${aws_iam_role.check_file_lambda.name}"
    policy_arn = "${aws_iam_policy.s3-access-ro.arn}"
}

We need to add one more thing to the role. It’s a default policy from AWS that allows the execution of the function:

resource "aws_iam_role_policy_attachment" "basic-exec-role" {
    role       = "${aws_iam_role.check_file_lambda.name}"
    policy_arn = "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"
}

We also have to allow our Cloudwatch Event Rule to call our Lambda function:

resource "aws_lambda_permission" "allow_cloudwatch_to_call_check_file" {
    statement_id = "AllowExecutionFromCloudWatch"
    action = "lambda:InvokeFunction"
    function_name = "${aws_lambda_function.check_file_lambda.function_name}"
    principal = "events.amazonaws.com"
    source_arn = "${aws_cloudwatch_event_rule.check-file-event.arn}"
}

Finally, we define our Lambda function. We’ll set a timeout of 10 seconds, and the runtime is python2.7.

We also set the function in the script that needs to be called: check_file_lambda.handler.

resource "aws_lambda_function" "check_file_lambda" {
    filename = "check_file_lambda.zip"
    function_name = "check_file_lambda"
    role = "${aws_iam_role.check_file_lambda.arn}"
    handler = "check_file_lambda.handler"
    runtime = "python2.7"
    timeout = 10
    source_code_hash = "${base64sha256(file("check_file_lambda.zip"))}"
}

You will notice that we are pointing at a zip file for our code. This is normal: we need to upload everything our function needs in a single zip file. It will contain our script, but also its dependencies (the Python packages).
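The deploy script we’ll write next builds that archive with zip, but the layout it produces can be sketched in pure Python: the handler module sits at the root of the archive, with the contents of site-packages alongside it (the build_artifact name is mine; it is just an illustration of the layout, not part of the actual deployment):

```python
import os
import zipfile

def build_artifact(zip_path, script_path, site_packages_dir):
    """Bundle the handler script and its dependencies into one zip."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as z:
        # The handler module goes at the archive root so Lambda can
        # resolve a handler like "check_file_lambda.handler".
        z.write(script_path, os.path.basename(script_path))
        # Dependencies are archived relative to site-packages,
        # not under a venv/ prefix.
        for root, _, files in os.walk(site_packages_dir):
            for name in files:
                full = os.path.join(root, name)
                z.write(full, os.path.relpath(full, site_packages_dir))
```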


We created the function and we wrote the Terraform setup. The only thing we need to do now is to deploy! The steps for this are the following:

  1. Create a zip file with the function and the python dependencies
  2. Run terraform apply to deploy

Obviously, there’s no way we’re not going to automate that part. Here is a small bash script that will handle it for us.

set -e

# Get Virtualenv Directory Path
SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
if [ -z "$VIRTUAL_ENV_DIR" ]; then
    VIRTUAL_ENV_DIR="$SCRIPT_DIR/venv"
fi

echo "Using virtualenv located in : $VIRTUAL_ENV_DIR"

# If zip artefact already exists, back it up
if [ -f $SCRIPT_DIR/check_file_lambda.zip ]; then
    mv $SCRIPT_DIR/check_file_lambda.zip $SCRIPT_DIR/check_file_lambda.zip.bak
fi

# Add virtualenv libs in new zip file
cd $VIRTUAL_ENV_DIR/lib/python2.7/site-packages
zip -r9 $SCRIPT_DIR/check_file_lambda.zip *

# Add python code in zip file
cd $SCRIPT_DIR
zip -r9 $SCRIPT_DIR/check_file_lambda.zip check_file_lambda.py

# Run terraform apply
terraform apply

Alright, our setup is done!

The only thing we need to do to deploy our function is to run that script.

The first thing I want to point out in this conclusion is that everything we did here is entirely serverless, which, when you think about it, is really awesome. We basically created a serverless cron job here.

With this automation, our code is only a few seconds away from being deployed to AWS. You can re-use that project for any Lambda function that you have, with just a few tweaks to adapt to your use case.

I hope you’ll enjoy using Terraform and AWS Lambda as much as I do!