I am using Redis in my Rails project to subscribe to channels and publish to those channels when an event occurs. On the client side, I register an EventSource that corresponds to each of these channels. Whenever an event occurs on a subscribed channel, the server does an SSE write so that all registered clients receive the update.
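Roughly, the server side looks like this (a simplified sketch assuming the redis gem and ActionController::Live; the controller and channel names are placeholders for my real ones):

```ruby
# app/controllers/events_controller.rb -- illustrative sketch only
class EventsController < ApplicationController
  include ActionController::Live

  def stream
    response.headers["Content-Type"] = "text/event-stream"
    sse = SSE.new(response.stream, event: "update")
    # One dedicated Redis connection per subscriber: SUBSCRIBE blocks,
    # so this server thread is held for the life of the client's connection.
    redis = Redis.new
    redis.subscribe("events.updates") do |on|
      on.message do |_channel, message|
        sse.write(message)
      end
    end
  rescue ActionController::Live::ClientDisconnected, IOError
    # The client went away; fall through to cleanup.
  ensure
    redis&.quit
    sse&.close
  end
end

# Elsewhere, when an event occurs:
#   Redis.new.publish("events.updates", payload.to_json)
```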
Now the connection with the server stays alive for each client subscribed to those channels, i.e. the server thread dedicated to that client keeps running until the client disconnects. With this approach, if there were 1,000 concurrent users subscribed to a channel, I'd have 1,000 TCP/IP connections open.
I am using Puma as the web server, as suggested in this tutorial. By default, Puma allows a maximum of 16 threads, and I can raise that limit.
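For reference, that limit lives in config/puma.rb; something along these lines (the numbers here are just illustrative, not a recommendation):

```ruby
# config/puma.rb
max_threads = ENV.fetch("RAILS_MAX_THREADS", 16).to_i
threads max_threads, max_threads

# Multiple worker processes multiply the total thread budget
# (workers * max_threads) at the cost of extra memory per process.
workers ENV.fetch("WEB_CONCURRENCY", 2).to_i
preload_app!
```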
I can't predict how many concurrent users my app will have at any given time, so I don't know what maximum thread count to specify in Puma. In the worst case, if each concurrent user ties up a dedicated thread and the count reaches Puma's maximum, the app would freeze for all users until one of them disconnects. I was excited to use Rails live streaming and server-sent events in my project, but this approach puts me at risk of exactly that scenario.
I'm not sure what a typical max thread count is for Puma with a large concurrent user base.
Should I consider other approaches instead, such as Ajax-based polling, or Node.js with its event-driven, non-blocking I/O model? Or should I just run some benchmarks to find out how high my max thread count can go?