
I am a big fan of particle.io and was very excited when they added a Google Cloud Platform (GCP) integration, so I can save my IoT data into GCP's Cloud Datastore.

I've followed their tutorial and got it working, but I need some advice on implementing this so it can scale on GCP.

My current implementation is like so: https://docs.particle.io/tutorials/integrations/google-cloud-platform/#example-use-cases

Basically, I have a GCP Compute Engine instance that runs a Node.js script which listens for the Pub/Sub events (sent by my IoT devices) and saves them to Datastore.
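For reference, my script is essentially a Pub/Sub pull subscriber along these lines (a sketch rather than my exact code; the subscription name and Datastore kind are placeholders, and the message attributes are the ones the Particle integration sets):

    const {PubSub} = require('@google-cloud/pubsub');
    const {Datastore} = require('@google-cloud/datastore');

    const pubsub = new PubSub();
    const datastore = new Datastore();

    // Subscription attached to the topic the Particle integration publishes to.
    const subscription = pubsub.subscription('particle-events');

    subscription.on('message', (message) => {
      const entity = {
        key: datastore.key('ParticleEvent'), // Datastore assigns the ID
        data: {
          device_id: message.attributes.device_id,
          published_at: message.attributes.published_at,
          data: message.data.toString(), // the event payload published by the device
        },
      };

      datastore
        .save(entity)
        .then(() => message.ack()) // ack only after a successful write
        .catch((err) => console.error('Datastore save failed:', err));
    });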

Now, because I want this to scale, the Node.js script should ideally run on a managed service that can respond to spikes automatically. But GCP does not seem to have anything like this.

In AWS I could do this: IoT Data -> Particle.io AWS WebHook -> AWS API Gateway Endpoint -> AWS Lambda -> AWS DynamoDB

All of those AWS pieces are fully managed.

What's the best way to run that Node.js script on GCP in a fully managed, always-available way, so that it listens for Pub/Sub events, saves to Datastore, and automatically scales as load increases?

Any help/advice will be appreciated.

Thanks very much, Mark

Google Cloud Functions is available, which is the equivalent of AWS Lambda. I think it is in alpha; check it out. The same serverless workflow can be established using GCF. - Lakshman Diwaakar
Yes, Cloud Functions would be ideal for this, but it is still in preview and I need a production-ready option. - newbreedofgeek

1 Answer


You have a number of options:

1- As someone else mentioned, there is Cloud Functions. It's basically a Node.js function you deploy, and Google Cloud takes care of scaling it up/down for you (see the sketch after this list).

2- You can deploy your Node.js app to App Engine Flex, which has autoscaling enabled by default (see the sample app.yaml below).

3- If you want to stay on Compute Engine, you can configure autoscaling yourself on a managed instance group (see the gcloud command below).
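For option 1, the function would be a Pub/Sub-triggered background function. A minimal sketch, assuming a topic named particle-events and a Datastore kind named ParticleEvent (both placeholders):

    const {Datastore} = require('@google-cloud/datastore');
    const datastore = new Datastore();

    // Pub/Sub background function: message.data arrives base64-encoded.
    // Deploy with something like:
    //   gcloud functions deploy saveParticleEvent --trigger-topic particle-events --runtime nodejs18
    exports.saveParticleEvent = (message, context) => {
      const payload = message.data
        ? Buffer.from(message.data, 'base64').toString()
        : '';

      // Returning the promise tells Cloud Functions when the work is finished.
      return datastore.save({
        key: datastore.key('ParticleEvent'),
        data: {
          device_id: message.attributes && message.attributes.device_id,
          published_at: message.attributes && message.attributes.published_at,
          data: payload,
        },
      });
    };

There is no VM to manage here; Google runs and scales the function for you.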
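For option 2, autoscaling is on by default in the flexible environment, and you can bound it in app.yaml. A sketch with illustrative values:

    runtime: nodejs
    env: flex

    # Autoscaling is the default for the flexible environment;
    # these settings just set its bounds and target.
    automatic_scaling:
      min_num_instances: 1
      max_num_instances: 10
      cpu_utilization:
        target_utilization: 0.6

One caveat: Flex instances must respond to HTTP health checks, so the app needs a small HTTP server even if its real work is listening to Pub/Sub.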
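For option 3, note that Compute Engine autoscaling applies to a managed instance group rather than a single instance, so you would put your VM behind an instance template and group, then attach an autoscaler along these lines (group name, zone, and thresholds are placeholders):

    gcloud compute instance-groups managed set-autoscaling particle-workers \
        --zone us-central1-a \
        --min-num-replicas 1 \
        --max-num-replicas 10 \
        --target-cpu-utilization 0.6

This keeps your current script unchanged, but it is the least managed of the three options.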