
Got a function triggered on kinesis stream messages (serverless.yml):

functions:
  kinesis-handler:
    handler: kinesis-handler.handle
    events:
      - stream:
          type: kinesis
          arn:
            Fn::Join:
              - ':'
              - - arn
                - aws
                - kinesis
                - Ref: AWS::Region
                - Ref: AWS::AccountId
                - stream/intercom-stream
          startingPosition: LATEST
          batchSize: 100
          enabled: true

The function does get triggered eventually (2-5 seconds after the message is sent), but not immediately. Is this by design? Can I assume Kinesis data streams are not a good fit for (near) real-time event-driven architectures?

What actually triggers a Lambda function when the trigger is a Kinesis stream? It looks like there is just background polling every 1-2 seconds, and the Lambda is invoked if new records are found in the stream.

I've also noticed the same behaviour. For my use case a few seconds of delay is fine, but I can see this being an issue for a lot of people who expect real time. – dege

1 Answer


You have the batch size set to 100, which tells Lambda it can read up to 100 records from the shard before invoking your function.

There are two settings related to batching:

  • Batch size – The number of records to read from a shard in each batch, up to 10,000. Lambda passes all of the records in the batch to the function in a single call, as long as the total size of the events doesn't exceed the payload limit for synchronous invocation (6 MB).
  • Batch window – Specify the maximum amount of time to gather records before invoking the function, in seconds.

Before invoking your function, Lambda continues to read records from the stream until it has gathered a full batch, or until the batch window expires.

I haven't done performance testing with these two settings, but I would start by setting the batch size to 1 and the batch window to 0. There could be side effects from launching a large number of Lambda invocations, but it should give you the minimum delay possible.
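
If you're configuring this through the Serverless Framework, the stream event accepts a batchWindow property alongside batchSize (it maps to the event source mapping's MaximumBatchingWindowInSeconds). A minimal sketch, assuming your Framework version supports batchWindow and reusing the stream definition from your question:

functions:
  kinesis-handler:
    handler: kinesis-handler.handle
    events:
      - stream:
          type: kinesis
          arn:
            Fn::Join:
              - ':'
              - - arn
                - aws
                - kinesis
                - Ref: AWS::Region
                - Ref: AWS::AccountId
                - stream/intercom-stream
          startingPosition: LATEST
          batchSize: 1      # invoke as soon as a single record is available
          batchWindow: 0    # don't wait to gather more records before invoking
          enabled: true

With these values Lambda still polls each shard on its own schedule, so you won't get below the poller's baseline latency, but you remove any time spent accumulating a larger batch.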