I am a big fan of particle.io and was very excited when they added a Google Cloud Platform (GCP) integration, so I can save my IoT data into GCP's Datastore.
I've followed their tutorial and got it working, but I need some advice on implementing this so it can scale on GCP.
My current implementation is like so: https://docs.particle.io/tutorials/integrations/google-cloud-platform/#example-use-cases
Basically, I have a GCP Compute Engine instance running a node.js script that listens for the Pub/Sub events (sent by my IoT devices) and saves them to Datastore.
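For reference, the core of my script looks roughly like the sketch below. This is a minimal version, not my exact code: the subscription name, the `ParticleEvent` kind, and the `device_id`/`published_at` message attributes are my assumptions based on the tutorial.

```js
const {PubSub} = require('@google-cloud/pubsub');
const {Datastore} = require('@google-cloud/datastore');

const pubsub = new PubSub();
const datastore = new Datastore();

// Subscription name is an assumption; mine is whatever the tutorial created.
const subscription = pubsub.subscription('particle-subscription');

subscription.on('message', async (message) => {
  // Particle puts the event payload in the message data and the device
  // metadata in the attributes (attribute names assumed from the tutorial).
  const entity = {
    key: datastore.key('ParticleEvent'), // incomplete key; Datastore assigns an ID
    data: {
      deviceId: message.attributes.device_id,
      publishedAt: message.attributes.published_at,
      data: message.data.toString(),
    },
  };

  try {
    await datastore.save(entity);
    message.ack();
  } catch (err) {
    console.error('Failed to save event:', err);
    message.nack(); // redeliver so the event isn't lost
  }
});

subscription.on('error', (err) => console.error('Subscription error:', err));
```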
Now, because I want this to scale, the node.js script should ideally run on a managed service that can respond to spikes automatically, but GCP does not seem to have anything like that.
In AWS I could do this: IoT data -> Particle.io AWS webhook -> AWS API Gateway endpoint -> AWS Lambda -> AWS DynamoDB. Every AWS piece in that pipeline is fully managed.
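To illustrate what I mean by managed, the Lambda step in that pipeline would be just a short handler along these lines (a hypothetical sketch; the table name and the Particle webhook payload fields I'm reading are assumptions):

```js
// Hypothetical AWS Lambda handler (node.js), invoked by API Gateway
// when the Particle webhook fires.
const AWS = require('aws-sdk');
const dynamo = new AWS.DynamoDB.DocumentClient();

exports.handler = async (event) => {
  const body = JSON.parse(event.body); // Particle webhook JSON payload

  await dynamo.put({
    TableName: 'ParticleEvents', // assumed table name
    Item: {
      deviceId: body.coreid,          // assumed payload field names
      publishedAt: body.published_at,
      data: body.data,
    },
  }).promise();

  return {statusCode: 200, body: 'ok'};
};
```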
What's the best way to run that node.js script on GCP in a fully managed, always-available way, so that it keeps listening for Pub/Sub events, saves them to Datastore, and scales automatically as load increases?
Any help/advice will be appreciated.
Thanks very much, Mark