1
votes

As per the AWS documentation, https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/standard-queues.html#standard-queues-at-least-once-delivery

Amazon SQS stores copies of messages on multiple servers for redundancy and high availability.

My case: I have integrated my standard queue with a Lambda function, so whenever a new message arrives in the queue the Lambda function is invoked. Since AWS Lambda will continue to increase the number of concurrent function executions according to the queue size, if my queue has 1000 messages the number of concurrent executions of the Lambda function will also increase accordingly. In that case, is there any chance of multiple workers processing the same message copy by receiving it from multiple servers at the same time?

I have gone through the question AWS: multiple instances reading SQS, but there I didn't find anything about the concept of storing message copies on multiple servers.

3
When you say "I'm triggering my sqs with a lambda function", are you saying that Lambda sends a message to SQS, or that you are triggering a Lambda function to run whenever a message is sent to SQS? What do you mean by "my lambda function having multiple instances"? Feel free to Edit your question to provide more details. - John Rotenstein
@JohnRotenstein, I have edited my question. I mean to say that whenever a new message arrives in the queue, the Lambda function will be invoked, and "lambda function having multiple instances" means the number of concurrent executions of the Lambda function according to the queue size. - Jay

3 Answers

1
votes

Even a single instance of your Lambda function may receive a duplicate message. SQS makes a "best effort" to deliver a message only once, but makes no guarantees. If you want a guarantee that a message will be delivered only once, you would need to use SQS FIFO queues, but those don't support Lambda triggers at this time.
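If you do move to a FIFO queue, you would poll it yourself (since Lambda triggers aren't available for FIFO queues at this time), and deduplication is driven by the MessageDeduplicationId supplied by the producer. A minimal sketch with boto3, assuming a hypothetical queue named my-queue.fifo:

    import boto3

    sqs = boto3.client('sqs')

    # Hypothetical FIFO queue URL -- replace with your own.
    queue_url = 'https://sqs.us-east-1.amazonaws.com/123456789012/my-queue.fifo'

    # FIFO queues require a MessageGroupId; messages sent with the same
    # MessageDeduplicationId within the 5-minute deduplication window
    # are delivered only once.
    sqs.send_message(
        QueueUrl=queue_url,
        MessageBody='{"orderId": "1234"}',
        MessageGroupId='orders',
        MessageDeduplicationId='order-1234',
    )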

0
votes

The possibility of some code receiving the same message twice arises from the possibility that two ReceiveMessage calls happen at the same time and hit separate SQS servers.

While it isn't documented, I would suspect that the mechanism that triggers a Lambda function from an SQS queue (which is quite new) would not have this behavior since AWS is responsible for triggering Lambda, as opposed to an external process calling into SQS.

However, there is no documentation either way confirming whether or not it can happen.

0
votes

Standard queues can deliver duplicate messages. The issue you describe is a standard messaging-queue problem that everyone faces when working with JMS or any other message queue. We had a similar situation where the destination service was ECS instead of Lambda. We use a unique identifier stored in a database to check the status of a submission and verify whether the message has already been processed by another node.
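As an illustration of that approach, here is a minimal sketch of an idempotent Lambda handler, assuming a hypothetical DynamoDB table named processed_messages (with a string partition key messageId) and keying on the SQS messageId:

    import boto3
    from botocore.exceptions import ClientError

    dynamodb = boto3.client('dynamodb')

    # Hypothetical table used as the deduplication store.
    TABLE_NAME = 'processed_messages'

    def handler(event, context):
        for record in event['Records']:
            message_id = record['messageId']
            try:
                # Conditional put fails if this messageId was already recorded,
                # so a duplicate delivery is skipped instead of reprocessed.
                dynamodb.put_item(
                    TableName=TABLE_NAME,
                    Item={'messageId': {'S': message_id}},
                    ConditionExpression='attribute_not_exists(messageId)',
                )
            except ClientError as e:
                if e.response['Error']['Code'] == 'ConditionalCheckFailedException':
                    continue  # already handled by another invocation
                raise
            process(record['body'])  # your actual business logic

    def process(body):
        print('processing', body)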

A FIFO queue could have been a way to solve your problem, but FIFO queues are not currently supported as Lambda event sources.