Yes, that should be very possible. It depends a bit on what kind of data source your resolver uses. If you were using a Lambda data source:
- As you mentioned, Lambda invocations can be asynchronous, so your resolver Lambda could fire one off as a fire-and-forget call and see minimal additional overhead (see the first sketch after this list).
- You could use a Lambda event source mapping. For example, earlier this year Lambda launched SQS as an event source. What that means is, your Lambda resolver would put a message with some event context (account info, email address, etc.) into an SQS queue, and you'd have a second Lambda that listens to that queue. Lambda automagically polls it, so the consumer would be invoked almost immediately, parse that message, and send the email (see the second sketch after this list).
- There are additional Lambda event sources you might find helpful depending on your preferences/application needs, all of which are listed in the docs. In particular, SNS and/or Kinesis could be viable options. Conceptually they're the same: your resolver pushes a message into some AWS service, and the event-driven Lambda is invoked automatically.
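
For the fire-and-forget option, here's a minimal sketch of a Python resolver Lambda that invokes a second function asynchronously with boto3. The function name and payload fields are just placeholders; the key bit is `InvocationType="Event"`, which returns as soon as the request is queued, so it adds almost nothing to the resolver's latency.

```python
import json
import boto3

lambda_client = boto3.client("lambda")

def handler(event, context):
    # ... do the resolver's normal work here ...

    # Fire-and-forget: "Event" invocations return once the request is queued,
    # so the email-sending function runs in the background.
    lambda_client.invoke(
        FunctionName="send-limit-email",  # hypothetical function name
        InvocationType="Event",
        Payload=json.dumps({
            "email": event["arguments"]["email"],  # assumes these args come from AppSync
            "reason": "post-limit-reached",
        }),
    )

    return {"ok": True}
```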
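
And here's a rough sketch of the SQS variant, again with boto3: the resolver handler drops a message on a queue, and a second handler, wired up via an SQS event source mapping, reads the batch and sends the email through SES. The queue URL, identity fields, addresses, and email contents are all assumptions you'd replace with your own.

```python
import json
import boto3

sqs = boto3.client("sqs")
ses = boto3.client("ses")

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/limit-emails"  # placeholder

# Resolver side: push the event context onto the queue and return immediately.
def resolver_handler(event, context):
    sqs.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody=json.dumps({
            "accountId": event["identity"]["sub"],   # assumes Cognito identity
            "email": event["arguments"]["email"],
        }),
    )
    return {"queued": True}

# Consumer side: Lambda subscribed to the queue; it receives batches of
# messages and sends one email per message.
def email_handler(event, context):
    for record in event["Records"]:
        body = json.loads(record["body"])
        ses.send_email(
            Source="no-reply@example.com",
            Destination={"ToAddresses": [body["email"]]},
            Message={
                "Subject": {"Data": "You've reached your post limit"},
                "Body": {"Text": {"Data": "Upgrade your plan to keep posting."}},
            },
        )
```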
If you were using a DynamoDB data source:
- You could enable DynamoDB Streams on your table and hook up an event-driven Lambda that listens to that stream (sketched below).
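
A rough sketch of what that stream consumer could look like in Python, assuming the stream is configured to include new images and that post items carry a hypothetical `authorEmail` attribute:

```python
import boto3

ses = boto3.client("ses")

# Lambda attached to the table's stream. Each record describes one item change;
# here we only react to newly inserted posts.
def handler(event, context):
    for record in event["Records"]:
        if record["eventName"] != "INSERT":
            continue
        new_image = record["dynamodb"]["NewImage"]      # DynamoDB-typed JSON
        author_email = new_image["authorEmail"]["S"]    # hypothetical attribute
        ses.send_email(
            Source="no-reply@example.com",
            Destination={"ToAddresses": [author_email]},
            Message={
                "Subject": {"Data": "New post created"},
                "Body": {"Text": {"Data": "Thanks for posting!"}},
            },
        )
```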
In any case, you might find pipeline resolvers useful. They let you set up a linear, synchronous chain of functions within a single resolver to handle more complex flows. You could have an initial Lambda function/DDB lookup get the number of posts (and fail if someone is at their limit), then a second function do your normal action, then potentially even a third that sends the email that someone has hit the limit. A sketch of that first limit-check step is below.
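
Here's a minimal Python Lambda you could attach as the pipeline's first function, assuming a hypothetical `Posts` table keyed on `accountId`, Cognito auth, and a made-up limit of 10; raising an exception here should surface as an error in the GraphQL response and stop the rest of the pipeline from running.

```python
import boto3
from boto3.dynamodb.conditions import Key

table = boto3.resource("dynamodb").Table("Posts")  # hypothetical table name
POST_LIMIT = 10                                    # hypothetical limit

# First function in the pipeline: count the caller's posts and fail fast
# if they've already hit the limit.
def handler(event, context):
    account_id = event["identity"]["sub"]  # assumes Cognito auth
    result = table.query(
        KeyConditionExpression=Key("accountId").eq(account_id),
        Select="COUNT",
    )
    if result["Count"] >= POST_LIMIT:
        raise Exception(f"Post limit of {POST_LIMIT} reached for {account_id}")
    return {"postCount": result["Count"]}
```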