I would appreciate it if somebody could help me with a small doubt.
What's the difference between using socket.io's broadcast function and designing the architecture with pub/sub on Redis?
For instance, in the following example, the Node.js server is listening (via socket.io) for CRUD requests (create) for a "key" (model 'todo') and value 'data'. The moment it receives one, it emits back to the same user and broadcasts to all other users listening on the same "channel":
socket.on('todo:create', function (data, callback) {
  var id = guid.gen()
    , todo = db.set('/todo/' + id, data)
    , json = todo._attributes;
  socket.emit('todos:create', json);
  socket.broadcast.emit('todos:create', json);
  callback(null, json);
});
But there is another way of "broadcasting" with socket.io, and that is using a pub/sub platform such as Redis. For instance, in the following case, we are again listening for a CRUD request based on a "key" (model), function (create), and value (data). But this time, once the request is received, it is not sent via socket.broadcast.emit(); it is published to Redis instead:
socket.on(key + ':create', function (data, callback) {
  var t = new ModelClass(data)
    , name = '/' + key + ':create';
  t.save(function (err) {
    pub.publish(key, JSON.stringify({key: name, data: t}));
    callback(err, t); // acknowledge the request once the model is saved
  });
});
So on the server side, every change made to a model (and published to Redis) is caught by a message handler and sent to the client side (in my case, Backbone.js), which renders its model according to the key:value received:
sio.on('connection', function (socket) {
  sub.on('message', function (channel, message) {
    var msg = JSON.parse(message);
    if (msg && msg.key) {
      socket.emit(msg.key, msg.data);
    }
  });
});
So my question is very simple :-) : what is the difference between the two architectures? Which one is more scalable? Which is the better design? Which is more advanced?
It looks to me that the pub/sub architecture is suited to platforms that don't support 'realtime' natively, like Ruby, in contrast to Node.js, which does. Am I wrong?