I only know AWS among cloud providers, so I'll answer based on AWS. I use AWS Lambda whenever I can, except when the running time exceeds Lambda's 15-minute limit. In that case, I use AWS Batch (AWS Batch – Run Batch Computing Jobs on AWS).
You can also use AWS Fargate, but you'll have to configure a cluster and a Docker image.
EDIT 1:
Batch can be sent events via API Gateway like you would to Lambda, I assume?
I've never triggered a Batch job via API Gateway directly (I don't know if this is possible). I've always used API Gateway to trigger a Lambda, and the Lambda triggers Batch (check out this workflow, please, to get a better idea).
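For illustration, here is a minimal sketch (not the exact workflow linked above) of a Node.js Lambda handler that submits a Batch job using the AWS SDK for JavaScript v3; the job name, job queue, and job definition are placeholders you'd replace with your own:

```typescript
import { BatchClient, SubmitJobCommand } from "@aws-sdk/client-batch";
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";

const batch = new BatchClient({});

export const handler = async (
  event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  // Submit a job to an existing Batch job queue, passing the request body
  // along to the container as an environment variable.
  const response = await batch.send(
    new SubmitJobCommand({
      jobName: "transcription-job",       // placeholder
      jobQueue: "my-job-queue",           // placeholder: your Batch job queue
      jobDefinition: "my-job-definition", // placeholder: your Batch job definition
      containerOverrides: {
        environment: [{ name: "PAYLOAD", value: event.body ?? "" }],
      },
    })
  );

  // Return 202 right away; the Batch job keeps running in the background.
  return {
    statusCode: 202,
    body: JSON.stringify({ jobId: response.jobId }),
  };
};
```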
Also, you may use AWS CloudWatch Events to trigger an AWS Batch job. For instance, if you upload a file to S3 before transcribing it, you may trigger the AWS Batch job from the S3 event (check out this step by step, please).
How simple is it to convert a zipped Lambda function to an AWS Fargate image?
It's not so difficult if you know about Docker, AWS ECR, and ECS clusters.
First, you need to create a Docker image with your source code. Check out this step by step, please. Basically, you'll unzip your code, copy it into the Docker image, run npm install, and set the run command in a Dockerfile.
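As a rough sketch, assuming a Node.js codebase, such a Dockerfile might look like this (the base image and entry point are placeholders for your own setup):

```dockerfile
# Placeholder base image; pick the Node.js version your code targets
FROM node:18

WORKDIR /app

# Copy the unzipped Lambda source code into the image
COPY . .

# Install dependencies
RUN npm install

# Placeholder entry point: the command the container runs on startup
CMD ["node", "index.js"]
```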
After that, create an AWS ECR repository and upload your Docker image to it.
Then create an AWS ECS cluster.
Create an AWS Fargate task definition.
Finally, run the task via Lambda.
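For that last step, here is a minimal sketch using the AWS SDK for JavaScript v3; the cluster, task definition, subnet, and security group values are all placeholders:

```typescript
import { ECSClient, RunTaskCommand } from "@aws-sdk/client-ecs";

const ecs = new ECSClient({});

export const handler = async (): Promise<void> => {
  // Launch one instance of the Fargate task on the existing cluster.
  await ecs.send(
    new RunTaskCommand({
      cluster: "my-cluster",                // placeholder: your ECS cluster
      taskDefinition: "my-task-definition", // placeholder: your Fargate task definition
      launchType: "FARGATE",
      count: 1,
      networkConfiguration: {
        awsvpcConfiguration: {
          subnets: ["subnet-xxxxxxxx"],     // placeholder subnet IDs
          securityGroups: ["sg-xxxxxxxx"],  // placeholder security group
          assignPublicIp: "ENABLED",        // lets the task pull the image without a NAT gateway
        },
      },
    })
  );
};
```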
If you don't have experience with Docker and AWS Fargate, AWS Batch is easier to implement.
To stay serverless, you should consider whether the task can be broken into smaller containers and chained together, say using something like AWS Step Functions. – asr9