
Edit: I have long polling JavaScript that talks to a Django view. The view is shown below. It loses some of the messages that I publish to the channel from the Redis client. Also, I should not be connecting to Redis on every request (perhaps the Redis objects can be saved in the session?). If someone can point out the changes I need to make this view work with long polling, that would be awesome! Thank you!

    import json
    import logging

    import redis
    from django.http import HttpResponse

    logger = logging.getLogger(__name__)

    def listen(request):
        if request.session:
            logger.info('request session: %s' % request.session)
        channel = request.GET.get('channel', None)
        if channel:
            logger.info('not in cache - first time - constructing redis object')
            r = redis.Redis(host='localhost', port=6379, db=0)
            p = r.pubsub()
            logger.info('subscribing to channel: %s' % channel)
            p.psubscribe(channel)
            logger.info('subscribed to channel: %s' % channel)
            # Blocks until the next item arrives; note that the first item
            # yielded after psubscribe() is the subscription confirmation,
            # not a published message.
            message = next(p.listen())
            logger.info('got msg %s' % message)

            return HttpResponse(json.dumps(message))

        return HttpResponse('')

--- Original question --- I am trying to create a chat application (using Django/Python) and am trying to avoid a regular polling mechanism. I have been struggling with this, so any pointers would be really appreciated!

Since WebSockets are not supported in most browsers, I think long polling is the right choice. Right now I am looking for something that scales better than regular polling and is easy to integrate with the Python/Django stack. Once I am done with this development, I plan to evaluate other Python frameworks (Tornado, Twisted, gevent, etc. come to mind).

I did some research and liked the Redis pub/sub mechanism: the chat message gets published to a channel to which both users have already subscribed. Following are my questions:

  1. From what I understand, Apache would not scale well since long polling would soon run into process/thread limits. Hence I have decided to switch to nginx. Is this rationale correct? Also, are there any issues with nginx that I should be worried about? In particular, I am worried about the latest version not supporting HTTP 1.1 for proxy passing, as mentioned in the blog post at http://www.letseehere.com/reverse-proxy-web-sockets.

  2. How do I create the client portion of the message subscription on the browser side? In my mind, it would be a URL that the JavaScript code would "long poll". So at the JavaScript level, the client polls a URL that gets "blocked" in a "non-blocking way" on the server side. When a result (in this case a new chat message) appears, the server returns it. The JavaScript does what it needs to and then polls the same URL again. Is this thinking correct? And what happens in between, while the JavaScript loop is re-issuing the request - do we lose any messages from the server side? (A rough sketch of the loop I have in mind follows this list.)
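To make the shape of that loop concrete, here is roughly what I have in mind, sketched in Python with the requests library rather than browser JavaScript (the /listen URL and the channel parameter are just placeholders matching the view above):

    import requests

    # Rough long-polling loop: each request blocks on the server until a
    # message arrives (or the request times out), then we immediately poll again.
    while True:
        try:
            resp = requests.get('http://localhost:8000/listen',
                                params={'channel': 'foo'}, timeout=90)
        except requests.exceptions.Timeout:
            continue  # no message within the window; poll again
        if resp.text:
            print('got message:', resp.text)
        # Note: plain Redis pub/sub is fire-and-forget, so anything published
        # between the end of one request and the start of the next is lost.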

In essence, I want to create the following:

  1. From Redis, I publish a message to a channel "foo" (I can also use redis-cli - it is easy to incorporate into Python/Django later; see the sketch after this list).

  2. I want the same message to appear in two browser windows that use the same JS code to poll. Assume that the browser code knows the channel name, for testing purposes.

  3. I publish a second message that again appears in both browser windows.
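For step 1, a minimal redis-py publishing sketch (channel name "foo" as above; redis-cli's `PUBLISH foo "..."` would do the same thing) would be something like:

    import redis

    # Publish a test message to channel "foo"; every client currently
    # long-polling /listen?channel=foo should receive it.
    r = redis.Redis(host='localhost', port=6379, db=0)
    r.publish('foo', 'hello from the publisher')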

I am new to real-time apps, so apologies for any question that may not make sense.

Thank you!


1 Answer


To answer your question partly and mention one option out of many: Gunicorn used with an async worker class is a solution for long-polling/non-blocking requests that is really easy to set up!
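For example, a minimal gunicorn config file might look like the sketch below; it assumes gevent is installed and that your Django project's WSGI module is mysite.wsgi (adjust the names for your project):

    # gunicorn.conf.py -- minimal sketch; assumes gevent is installed and the
    # WSGI module is mysite.wsgi (adjust for your project).
    bind = "127.0.0.1:8000"
    worker_class = "gevent"       # async workers, so long-poll requests don't tie up a worker process
    workers = 2
    worker_connections = 1000     # concurrent connections each gevent worker may hold open
    timeout = 90                  # let long-polling requests stay open longer than the 30s default

Start it with `gunicorn -c gunicorn.conf.py mysite.wsgi:application` and put nginx in front as a reverse proxy.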