0 votes

I have a PHP web application running on an EC2 server. The app is integrated with another service, which involves subscribing to a number of webhooks.

The number of requests the server receives from these webhooks has become unmanageable, and I'm looking for a more efficient way to deal with the data they send.

My initial thought was to use API Gateway to put these requests into an SQS queue and read from that queue in batches.

However, I would like these batches to be read by the EC2 instance, because the code that processes the webhooks is reused throughout my application.

Is this possible, or am I forced to use a Lambda function with SQS? Is there a better way?


2 Answers

2 votes

The approach you suggested (API Gateway + SQS) will work just fine. There is no need to use AWS Lambda. You'll want to use the AWS SDK for PHP when writing the application code that receives messages from your SQS queue.

I've used this pattern before and it's a great solution.
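As a rough illustration, here is a minimal sketch of a polling loop on the EC2 instance using the AWS SDK for PHP (v3). The queue URL and region are placeholders, and processWebhookPayload() is a hypothetical helper standing in for whatever code your application already uses to handle webhooks:

```php
<?php

require 'vendor/autoload.php';

use Aws\Sqs\SqsClient;
use Aws\Exception\AwsException;

// Placeholder queue URL and region -- replace with your own.
$queueUrl = 'https://sqs.us-east-1.amazonaws.com/123456789012/webhook-queue';

$sqs = new SqsClient([
    'region'  => 'us-east-1',
    'version' => '2012-11-05',
]);

try {
    // Long-poll for up to 10 messages per request.
    $result = $sqs->receiveMessage([
        'QueueUrl'            => $queueUrl,
        'MaxNumberOfMessages' => 10,
        'WaitTimeSeconds'     => 20,
    ]);

    foreach ($result->get('Messages') ?? [] as $message) {
        // Hypothetical helper: your existing webhook-processing code.
        processWebhookPayload(json_decode($message['Body'], true));

        // Delete the message only after it has been processed successfully.
        $sqs->deleteMessage([
            'QueueUrl'      => $queueUrl,
            'ReceiptHandle' => $message['ReceiptHandle'],
        ]);
    }
} catch (AwsException $e) {
    error_log($e->getMessage());
}
```

You would typically run a loop like this from a long-running worker or a cron job on the instance, so the EC2 server drains the queue at its own pace instead of absorbing every webhook request directly.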

0 votes

. . . am I forced to use a lambda function with SQS?

SQS plus Lambda is basically free. At the time of writing, the free tier includes 1M (million) Lambda invocations and 1M (million) SQS requests per month. Remember that each SQS request may return up to 10 messages, so that's a potential 10M messages, all inside the free tier.

Your EC2 instance is likely always on; your Lambda function is not. Even if you only use Lambda to push the SQS data into a data store such as an RDBMS for your EC2 instance to poll periodically, the operation would be bullet-proof and very inexpensive. With SQS in place, you could also transition the common EC2 code to Lambda function(s), which now have a maximum run time of 15 minutes.
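If you did go the Lambda route, note that PHP is not a natively supported Lambda runtime, but the Bref runtime lets you run PHP handlers there. Below is a minimal sketch of an SQS-triggered handler that writes each webhook payload to a relational database for the EC2 instance to poll later; the DSN, credentials, and table name are placeholders, not anything from your setup:

```php
<?php

require 'vendor/autoload.php';

use Bref\Context\Context;
use Bref\Event\Sqs\SqsEvent;
use Bref\Event\Sqs\SqsHandler;

// SQS-triggered Lambda handler using the Bref runtime for PHP.
class WebhookQueueHandler extends SqsHandler
{
    public function handleSqs(SqsEvent $event, Context $context): void
    {
        // Placeholder connection details and table name.
        $pdo = new PDO('mysql:host=your-rds-endpoint;dbname=app', 'user', 'password');
        $insert = $pdo->prepare('INSERT INTO webhook_events (body) VALUES (:body)');

        foreach ($event->getRecords() as $record) {
            // Store each webhook payload for the EC2 instance to pick up later.
            $insert->execute([':body' => $record->getBody()]);
        }
    }
}

return new WebhookQueueHandler();
```

With something like this in place, the EC2 application reads new rows from the table on its own schedule, and the always-on instance never has to face the raw webhook traffic.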

To cite my sources: