
I'm building a Sails app that is using socket.io and see that Sails offers a method for using multiple servers via redis:

http://sailsjs.org/documentation/concepts/realtime/multi-server-environments

Since I will be deploying the app on AWS, preferably behind an ELB (Elastic Load Balancer) with an Auto Scaling group of multiple EC2 instances, I was wondering how I can handle this so I don't have to run and manage a separate Redis instance myself.

Maybe I could use AWS ElastiCache? If so, how would this be done?

Now that AWS has released the new ALB (Application Load Balancer), which supports WebSockets, could it be used to help simplify things?

Thanks in advance

Update: use cases in the application

  1. Allow end users to update data dynamically from their own dashboards, and display analytics/stats in real time to an administrator.

  2. Application statuses change based on specific timings, e.g. at a given start date/time the app allows users to update data.


1 Answer


Regarding your first question: you don't want to run Redis on the same servers Sails is running on, especially if you are using Auto Scaling. Redis needs to live on a server that won't disappear when your environment experiences a "scale-in" event, so it is going to have to be a separate "server" somewhere.

ElastiCache is just separate EC2 instances running Redis, where AWS handles most of the management for you, to the point that you can't even SSH into the instance. It's similar to how RDS works. ElastiCache will certainly work for your scenario. You might also want to look at the third-party service Redis Labs, which likewise manages Redis instances on AWS for you.
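To make the connection concrete, here is a minimal sketch of what pointing Sails' socket layer at an ElastiCache endpoint might look like. This assumes Sails 0.12's `config/sockets.js` conventions with the `socket.io-redis` adapter (newer Sails 1.x uses the `@sailshq/socket.io-redis` package instead); the hostname is a made-up placeholder, so substitute the "Primary Endpoint" shown in your ElastiCache console:

```javascript
// config/sockets.js
// Hypothetical multi-server socket config: every Sails/EC2 instance
// points at the same ElastiCache Redis node, which relays socket.io
// messages between instances behind the load balancer.
module.exports.sockets = {

  // Use the Redis-backed adapter instead of the default in-memory one.
  // (On Sails 1.x this would be '@sailshq/socket.io-redis'.)
  adapter: 'socket.io-redis',

  // Placeholder endpoint; copy the real Primary Endpoint from the
  // ElastiCache console for your cluster.
  host: 'my-cluster.abc123.0001.use1.cache.amazonaws.com',
  port: 6379

};
```

Note that ElastiCache is only reachable from inside your VPC, so this configuration assumes the EC2 instances and the Redis node share a VPC and that the cache's security group allows inbound traffic on port 6379 from the instances.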

Regarding your second question: an Application Load Balancer has no bearing on your Redis usage. It does, however, bring native support for WebSockets, which it sounds like you are using. So yes, you should use an ALB instead of a classic ELB.