1
votes

I currently have one AWS Lambda function that updates a DynamoDB table, and I need a second Lambda function to run after the data is updated. Is there any benefit to using a DynamoDB trigger in this case instead of invoking the second Lambda from the first?

It looks like programmatic invocation would give me more control over when the Lambda is called (i.e., I could wait for several updates to occur before calling it), and reading from a DynamoDB Stream costs money while simply invoking the Lambda does not.

So, is there a benefit to using a trigger here? Or would I be better off invoking the Lambda myself?


3 Answers

4
votes

A DynamoDB Stream seems to be the better practice because:

  • you delegate the responsibility of invoking the post-processor function away from your writer Lambda, which keeps the writer simpler (and therefore faster);
  • you make it easier to connect new external writers to the same table; otherwise you would have to implement the logic to call the post-processor in each of them as well;
  • you guarantee that all data is post-processed, even if somebody adds a new item through the DynamoDB web console :)
  • money-wise, the execution time the writer Lambda spends on the invoke() call will likely cover the cost of the stream;
  • unless you use DynamoDB transactions, the data may not yet be visible to the post-processor if the writer calls it too soon. If your business logic doesn't otherwise need transactions, using them just to cover this problem adds extra time and cost.

P.S. You can, of course, batch from the DynamoDB Stream out of the box with a simple setting; you are not obliged to invoke the post-processor for every write operation.

1
votes

After the data is updated, you can publish an SQS message, then configure another function to read from Amazon SQS by creating an SQS trigger in the Lambda console.

To create a trigger

  1. Open the Lambda console Functions page.

  2. Choose a function.

  3. Under Designer, choose Add trigger.

  4. Choose a trigger type.

  5. Configure the required options and then choose Add.

Lambda supports the following options for Amazon SQS event sources.

Event Source Options

  • SQS queue – The Amazon SQS queue to read records from.
  • Batch size – The number of items to read from the queue in each batch, up to 10. The event may contain fewer items if the batch that Lambda read from the queue had fewer items.
  • Enabled – Disable the event source to stop processing items.
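The console steps and options above also have a programmatic equivalent in the AWS SDK's createEventSourceMapping call. The sketch below builds the same options (queue ARN, batch size, enabled flag); the buildMappingParams and createSqsTrigger helper names and the client-injection style are my own, and a real call needs an AWS.Lambda client from the aws-sdk package.

```javascript
'use strict';

// Build the event source mapping parameters matching the console options above.
function buildMappingParams(queueArn, functionName, batchSize) {
  return {
    EventSourceArn: queueArn,   // SQS queue – the queue to read records from
    FunctionName: functionName,
    BatchSize: batchSize,       // Batch size – up to 10 items per batch
    Enabled: true               // Enabled – set false to stop processing
  };
}

// Given an AWS.Lambda client, create the SQS trigger for a function.
function createSqsTrigger(lambdaClient, queueArn, functionName, callback) {
  lambdaClient.createEventSourceMapping(
    buildMappingParams(queueArn, functionName, 10),
    callback
  );
}
```

With the real SDK this would be invoked as, for example, createSqsTrigger(new AWS.Lambda({ region: 'us-east-1' }), queueArn, functionName, callback), using your own queue ARN and function name.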
0
votes
// Hypothetical queue URL; replace {AWS_ACCOUNT_ID} with your account ID.
var QUEUE_URL = 'https://sqs.us-east-1.amazonaws.com/{AWS_ACCOUNT_ID}/matsuoy-lambda';
var AWS = require('aws-sdk');
var sqs = new AWS.SQS({ region: 'us-east-1' });

exports.handler = function (event, context) {
  // Forward the incoming event to the queue as a JSON message.
  var params = {
    MessageBody: JSON.stringify(event),
    QueueUrl: QUEUE_URL
  };
  sqs.sendMessage(params, function (err, data) {
    if (err) {
      console.log('error: failed to send message', err);
      context.done(err, 'ERROR: could not put message on SQS'); // ERROR with message
    } else {
      console.log('data:', data.MessageId);
      context.done(null, ''); // SUCCESS
    }
  });
};

Please don't forget to add an SQS trigger for this queue to the other function. That function will then receive the SQS messages automatically and can handle them.