8 votes

I am looking for a way to easily reuse akka-stream flows.

I treat the Flow I intend to reuse as a function, so I would like to keep its signature like:

Flow[Input, Output, NotUsed]

Now when I use this flow I would like to be able to 'call' this flow and keep the result aside for further processing.

So I want to start with a Flow emitting [Input], apply my flow, and proceed with a Flow emitting [(Input, Output)].

example:

val s: Source[Int, NotUsed] = Source(1 to 10)

val stringIfEven = Flow[Int].filter(_ % 2 == 0).map(_.toString)

val via: Source[(Int, String), NotUsed] = ???

Now this is not possible in a straightforward way, because combining the flows with .via() would give me a Flow emitting just [Output]:

val via: Source[String, NotUsed] = s.via(stringIfEven)

An alternative is to make my reusable flow emit [(Input, Output)], but this requires every flow to push its input through all the stages, which makes my code look bad.
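For illustration, this is roughly what that unwanted alternative looks like - a sketch reusing the logic of the stringIfEven flow above, but with the tupled signature baked in:

```scala
// Sketch of the rejected alternative: every stage has to thread
// the original input along by hand.
val stringIfEvenTupled: Flow[Int, (Int, String), NotUsed] =
  Flow[Int]
    .filter(_ % 2 == 0)
    .map(i => (i, i.toString)) // the input must be repeated in every map
```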

So I came up with a combiner like this:

def tupledFlow[In, Out](flow: Flow[In, Out, _]): Flow[In, (In, Out), NotUsed] =
  Flow.fromGraph(GraphDSL.create() { implicit b =>
    import GraphDSL.Implicits._

    val broadcast = b.add(Broadcast[In](2))
    val zip = b.add(Zip[In, Out])

    broadcast.out(0) ~> zip.in0
    broadcast.out(1) ~> flow ~> zip.in1

    FlowShape(broadcast.in, zip.out)
  })

This broadcasts each input element both to the wrapped flow and, in parallel, directly to the Zip stage, which joins the values into a tuple. It can then be elegantly applied:

val tupled: Source[(Int, String), NotUsed] = s.via(tupledFlow(stringIfEven))

Everything works great, but when the given flow performs a 'filter' operation, this combiner gets stuck and stops processing further events.

I guess that is due to 'Zip' behaviour, which requires all subflows to emit an element: in my case one branch passes the given object through directly, so the other subflow cannot ignore this element with filter(). Since it does, the stream stops, because Zip is waiting for a push.
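By contrast, the wrapper behaves as expected when the wrapped flow is strictly one-to-one - a sketch for comparison, reusing tupledFlow and s from above:

```scala
// A one-to-one flow: exactly one output per input, so Zip always
// receives a matching pair and the graph never deadlocks.
val toStr: Flow[Int, String, NotUsed] = Flow[Int].map(_.toString)

val ok: Source[(Int, String), NotUsed] = s.via(tupledFlow(toStr))
// emits (1,"1"), (2,"2"), ... for Source(1 to 10)
```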

Is there a better way to achieve flow composition? Is there anything I can do in my tupledFlow to get the desired behaviour when 'flow' drops elements with 'filter'?

The main problem with the concept here is that Flow[T, U, ...] is not a function. For each input element it may return 0, 1, or more output elements. It may even hold back input elements and use them only later, when more data is available. For this reason it is impossible to provide this feature generically if the wrapped flow doesn't support it itself. It can work generically if it is strictly enforced that the wrapped Flow is a one-to-one flow that actually behaves like a function (but filter doesn't work then). Usually, using mapAsync in such cases is a simpler way. – jrudolph
Yes, you're right. The problem would appear if my reusable flow returned N elements. Stating the assumption that the wrapped Flow may output 0 or 1 elements for every input element would allow writing a Zip operator with different semantics: one that zips with the input only if the wrapped Flow emits, and skips the element entirely if the wrapped Flow does not push anything. – Tomasz Bartczak
Even that would be hard to do, because the pulling and pushing of the wrapped flow does not happen synchronously. You cannot check whether "the wrapped Flow is not pushing any element" - it could just be slow or buffered, etc. – jrudolph

2 Answers

3 votes

Two possible approaches - with debatable elegance - are:

1) Avoid filtering stages by turning your filter into a Flow[Int, Option[Int], NotUsed]. This way you can apply your zipping wrapper around your whole graph, as was your original plan. However, the code looks more tainted, and there is added overhead from passing around Nones.

val stringIfEvenOrNone = Flow[Int].map{
  case x if x % 2 == 0 => Some(x.toString)
  case _ => None
}

val tupled: Source[(Int, String), NotUsed] = s.via(tupledFlow(stringIfEvenOrNone)).collect{
  case (num, Some(str)) => (num,str)
}

2) Separate the filtering and transforming stages, and apply the filtering ones before your zipping wrapper. Probably a more lightweight and better compromise.

val filterEven = Flow[Int].filter(_ % 2 == 0)

val intToString = Flow[Int].map(_.toString) // renamed so it doesn't shadow Any.toString

val tupled: Source[(Int, String), NotUsed] = s.via(filterEven).via(tupledFlow(intToString))

EDIT

3) Posting another solution here for clarity, as per the discussions in the comments.

This flow wrapper emits each element produced by the given flow, paired with the original input element that generated it. It works for any kind of inner flow (emitting 0, 1, or more elements for each input).

def tupledFlow[In, Out](flow: Flow[In, Out, _]): Flow[In, (In, Out), NotUsed] =
  Flow[In].flatMapConcat(in => Source.single(in).via(flow).map(out => in -> out))
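Applied to the original stringIfEven from the question, this version no longer deadlocks - a sketch, assuming the same s and stringIfEven definitions:

```scala
val pairedEvens: Source[(Int, String), NotUsed] = s.via(tupledFlow(stringIfEven))
// Odd inputs yield an empty inner Source, so they are simply skipped:
// emits (2,"2"), (4,"4"), (6,"6"), (8,"8"), (10,"10")
```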
1 vote

I came up with an implementation of tupledFlow that works when the wrapped Flow uses filter() or mapAsync(), and when the wrapped Flow emits 0, 1, or N elements for every input:

def tupledFlow[In, Out](flow: Flow[In, Out, _])(implicit materializer: Materializer, executionContext: ExecutionContext): Flow[In, (In, Out), NotUsed] = {
  val v: Flow[In, Seq[(In, Out)], NotUsed] = Flow[In].mapAsync(4) { in: In =>
    val outFuture: Future[Seq[Out]] = Source.single(in).via(flow).runWith(Sink.seq)
    val bothFuture: Future[Seq[(In, Out)]] = outFuture.map(seqOfOut => seqOfOut.map((in, _)))
    bothFuture
  }
  val onlyDefined: Flow[In, (In, Out), NotUsed] = v.mapConcat[(In, Out)](seq => seq.to[scala.collection.immutable.Iterable])
  onlyDefined
}

The only drawback I see here is that I am instantiating and materializing a flow for every single element - just to get the notion of 'calling a flow as a function'.

I didn't do any performance tests on that - however, since the heavy lifting is done in the wrapped Flow, which is executed in a future, I believe this will be OK.
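A quick sanity check - a sketch, assuming the stringIfEven definition from the question and an ActorSystem/Materializer plus ExecutionContext in scope:

```scala
// The wrapped filter flow drops odd elements; mapAsync preserves order,
// so the even inputs come out paired and in their original order.
val result: Future[Seq[(Int, String)]] =
  Source(1 to 6).via(tupledFlow(stringIfEven)).runWith(Sink.seq)
// expected to complete with Vector((2,"2"), (4,"4"), (6,"6"))
```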

This implementation passes all the tests from https://gist.github.com/kretes/8d5f2925de55b2a274148b69f79e55ac#file-tupledflowspec-scala