I'm experimenting with DynamoDB and Lambda and am having trouble with the following flow:
Lambda A is triggered by an S3 PUT event. It takes the object, an audio file, calculates its duration, and writes a record to DynamoDB for each 30 second segment.
Lambda B is triggered by the DynamoDB stream, downloads the file from S3, and operates on the 30 second segment defined in the DynamoDB row.
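To make the flow concrete, here's a simplified sketch of what function A does (the table/attribute names and the duration value are placeholders, not my exact code):

```python
import math

SEGMENT_SECONDS = 30  # each DynamoDB row covers one 30 second slice

def build_segment_items(bucket, key, duration_seconds):
    """Build one DynamoDB item per 30 second segment of the audio file."""
    n_segments = math.ceil(duration_seconds / SEGMENT_SECONDS)
    items = []
    for i in range(n_segments):
        items.append({
            "fileKey": key,  # partition key (placeholder name)
            "segmentStart": i * SEGMENT_SECONDS,
            "segmentEnd": min((i + 1) * SEGMENT_SECONDS, duration_seconds),
            "bucket": bucket,
        })
    return items

# In the real handler each item is written with
# boto3.resource("dynamodb").Table("Segments").put_item(Item=item),
# and those writes are what produce the stream events that trigger function B.
```

So a 95 second file produces 4 rows: [0, 30), [30, 60), [60, 90), [90, 95).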
My trouble is that when I run this flow, function A writes all of the required rows to DynamoDB, but function B:
- Does not seem to be triggered for each row in dynamo
- Times out after 5 minutes.
Configuration
- Function B is set with the highest memory and a 5 minute timeout
- The trigger is set with a batch size of 1 and a starting position of Latest
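With a batch size of 1, my understanding is that function B should receive one stream record per invocation. Its handler is shaped roughly like this (attribute names are illustrative placeholders):

```python
def lambda_handler(event, context):
    """Process DynamoDB stream records (with batch size 1, there is one per event)."""
    results = []
    for record in event["Records"]:
        # Stream events fire for INSERT, MODIFY, and REMOVE; only new rows matter here.
        if record["eventName"] != "INSERT":
            continue
        image = record["dynamodb"]["NewImage"]
        # Stream images use the DynamoDB attribute-value format, e.g. {"N": "30"}.
        key = image["fileKey"]["S"]
        start = int(image["segmentStart"]["N"])
        end = int(image["segmentEnd"]["N"])
        # Real code: download the file from S3 and process [start, end) here.
        results.append((key, start, end))
    return results
```

Example input: an event containing one INSERT record whose NewImage has fileKey "song.mp3" and segment bounds 30/60 yields [("song.mp3", 30, 60)].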
Things I've confirmed
- When function B is triggered, the download from S3 happens quickly, so that does not seem to be the blocker.
- When I trigger function B with a test event it executes perfectly.
- When I look at the CloudWatch metrics, function B has a nearly 100% error rate on invocations. I can't tell whether this means the function was invoked and errored, or could not be invoked at all.
Has anyone had similar issues? Any idea what to check next? Thanks