3
votes

I've been learning Play, and I'm getting most of the major concepts, but I'm struggling to understand what magic the platform is doing to enable all of these things.

In particular, let's say I have a controller that does something time-intensive. I understand how, using Futures and asynchronous processing, I can make these things appear not to block, but if the work is resource-intensive, of course it must block somewhere in the end. Per the documentation:

You can’t magically turn synchronous IO into asynchronous by wrapping it in a Future. If you can’t change the application’s architecture to avoid blocking operations, at some point that operation will have to be executed, and that thread is going to block. So in addition to enclosing the operation in a Future, it’s necessary to configure it to run in a separate execution context that has been configured with enough threads to deal with the expected concurrency.
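For concreteness, here is roughly what I take that advice to mean in code (my own sketch; the pool size and the slowJdbcQuery call are just placeholders):

    import java.util.concurrent.Executors
    import scala.concurrent.{ExecutionContext, Future}

    object BlockingIoSketch {
      // A dedicated pool, sized for the expected number of concurrent blocking
      // calls (32 is an arbitrary placeholder).
      val blockingIoEc: ExecutionContext =
        ExecutionContext.fromExecutor(Executors.newFixedThreadPool(32))

      // Stand-in for some synchronous, blocking operation (e.g. a JDBC query).
      def slowJdbcQuery(id: Long): String = { Thread.sleep(1000); s"row $id" }

      def loadRow(id: Long): Future[String] =
        Future {
          // This still blocks a thread, but one of the 32 above,
          // not a thread from the pool that serves HTTP requests.
          slowJdbcQuery(id)
        }(blockingIoEc)
    }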

This is the bit I'm not understanding: if some task that I'm running via a Future may be handled in a separate thread pool, what magic is Scala/Play doing in the framework to coordinate these threads? How does whichever thread is listening on the HTTP socket block long enough for all of the complex processing (DB loads, serialization to JSON, etc.) to happen on separate threads, and yet somehow return control to the original blocking thread that has to send something back to the client for that request?

1
@KevinMeredith that post is interesting from the perspective of describing why asynchronous processing is desirable; I'm already sold on that point. I'm trying to understand how it actually works, such that a thread that isn't blocked doing some heavy IO can hand work off to some EC (execution context) that does block (the response still has to be generated before it can be sent back) and then do the right thing. - FrobberOfBits

1 Answer

2
votes

Disclaimer: this is a simplified answer to the general problem; I don't want to make it even more complex by going into Play and Akka internals.

One method is to have a thread listening on the socket but not writing to it; let's call it A. A spawns a Future that itself contains all the data needed for the computation. It is important not to confuse the thread that does the processing with the data being processed: the data (memory) is shared by all threads (and sometimes needs explicit synchronization). The Future will (eventually) be processed by a thread B.
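A bare-bones sketch of that hand-off (the names here are made up for illustration):

    import scala.concurrent.{ExecutionContext, Future}

    object HandOffSketch {
      // Stand-in for the heavy processing that B will perform.
      def expensiveComputation(data: String): String = data.reverse

      // Called by A, the thread that just read a request off the socket.
      def handleRequest(requestBody: String)(implicit workerPool: ExecutionContext): Future[String] = {
        // The request data is plain memory, captured by the closure below;
        // any thread that eventually runs the closure can see it.
        Future {
          expensiveComputation(requestBody) // executed later, by some worker thread B
        }
        // A returns immediately and goes back to listening on the socket.
      }
    }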

Now, does A need to block until B is done? It could (and in many cases that might be the right solution), but here we hardly want to stop listening on our socket. So no, it doesn't: A forgets everything about the message and carries on with its life.
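For contrast, blocking would look something like this (again just a sketch), and it is exactly what we avoid doing on the socket thread:

    import scala.concurrent.duration._
    import scala.concurrent.ExecutionContext.Implicits.global
    import scala.concurrent.{Await, Future}

    object BlockingSketch {
      val work: Future[String] = Future("computed by some thread B")

      // A would park here, unable to do anything else (including accepting
      // new connections) until the Future completes or the timeout expires.
      val result: String = Await.result(work, 30.seconds)
    }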

So when B is done, the Future might be mapped, or have a listener attached, that sends the proper response. B itself can send it, given the information it has about the original message! You just need to be careful to synchronize access to the socket, to avoid colliding with a possible thread C that might have been processing a previous or later message in parallel.
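As a sketch (again with made-up names; Connection stands for whatever lets us write back to the client):

    import scala.concurrent.{ExecutionContext, Future}
    import scala.util.{Failure, Success}

    object ListenerSketch {
      trait Connection { def write(response: String): Unit }

      def expensiveComputation(data: String): String = data.reverse

      // A registers the callback and moves on; whichever thread completes the
      // Future (B, or another pool thread) runs it and writes the response,
      // using the connection captured in the closure.
      def process(requestData: String, conn: Connection)(implicit workerPool: ExecutionContext): Unit =
        Future(expensiveComputation(requestData)).onComplete {
          case Success(result) => conn.write(result)
          case Failure(err)    => conn.write(s"error: ${err.getMessage}")
        }
    }

In Play terms, returning the mapped Future from Action.async amounts to the same pattern, with the framework keeping track of which response belongs to which connection.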

Things can obviously get more complex, with threads spawning even more threads, queues where some threads write data and others read it, etc. (Play, being based on Akka, certainly includes a lot of message queues). But I hope to have convinced you that while this statement is correct:

You can’t magically turn synchronous IO into asynchronous by wrapping it in a Future. If you can’t change the application’s architecture to avoid blocking operations

Such a change in the application's architecture is certainly possible in many (most?) cases, and it has certainly been done inside Play.