10 votes

I have a terraform configuration that correctly creates a lambda function on aws with a zip file provided.

My problem is that I always have to package the lambda first (I use the serverless package command for this), so I would like to execute a script that packages my function and moves the zip to the right directory before Terraform creates the lambda function.

Is that possible? Maybe using a combination of null_resource and local-exec?


2 Answers

11 votes

You already proposed the best answer :)

When you add depends_on = ["null_resource.serverless_execution"] to your lambda resource, you ensure that packaging is done before the zip file is uploaded.

Example:

resource "null_resource" "serverless_execution" {
  provisioner "local-exec" {
    command = "serverless package ..."
  }
}

resource "aws_lambda_function" "update_lambda" {
  depends_on = ["null_resource.serverless_execution"]
  filename   = "${path.module}/path/to/package.zip"
  [...]
}

https://www.terraform.io/docs/provisioners/local-exec.html
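One caveat: the local-exec provisioner only runs when the null_resource is first created, so later applies will skip the packaging step unless something forces the resource to be replaced. A minimal sketch of one way around that, assuming you are happy to re-package on every apply, is to add a triggers map (timestamp() changes on each run; hashing the source files would be a less aggressive alternative):

resource "null_resource" "serverless_execution" {
  # A changed trigger value forces Terraform to recreate this resource,
  # which re-runs the local-exec provisioner (and thus the packaging step).
  triggers = {
    always_run = "${timestamp()}"
  }

  provisioner "local-exec" {
    command = "serverless package ..."
  }
}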

4 votes

The answer is already given, but I was looking for a way to install NPM modules on the fly, zip the code, and then deploy the Lambda function, with a longer creation timeout in case the function package is large. Here are my findings; they may help someone else.

# Install NPM modules before creating the ZIP
resource "null_resource" "npm" {
  provisioner "local-exec" {
    command = "cd ../lambda-functions/loadbalancer-to-es/ && npm install --prod=only"
  }
}

# Zip the Lambda function on the fly
data "archive_file" "source" {
  type        = "zip"
  source_dir  = "../lambda-functions/loadbalancer-to-es"
  output_path = "../lambda-functions/loadbalancer-to-es.zip"
  depends_on  = ["null_resource.npm"]
}


# Create the AWS Lambda function: memory size, NodeJS version, handler, endpoint, doctype and environment settings
resource "aws_lambda_function" "elb_logs_to_elasticsearch" {
  filename         = "${data.archive_file.source.output_path}"
  function_name    = "someprefix-alb-logs-to-elk"
  description      = "elb-logs-to-elasticsearch"
  memory_size      = 1024
  timeout          = 900
  runtime          = "nodejs8.10"
  role             = "${aws_iam_role.role.arn}"
  handler          = "index.handler"

  # Hash of the archive contents, so the function code is updated whenever the zip changes
  source_code_hash = "${data.archive_file.source.output_base64sha256}"

  # Allow up to 30 minutes for creation, since uploading a large package can be slow
  timeouts {
    create = "30m"
  }

  environment {
    variables = {
      ELK_ENDPOINT = "someprefix-elk.dns.co"
      ELK_INDEX    = "test-web-server-"
      ELK_REGION   = "us-west-2"
      ELK_DOCKTYPE = "elb-access-logs"
    }
  }
}
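
A note on source_code_hash: data.archive_file.source.output_base64sha256 is the hash of the generated zip's contents, so Terraform redeploys the function code only when the archive actually changes (hashing the output path string instead would stay constant and never trigger an update).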