1 vote

I've created a Spring Cloud Data Flow stream that uses the router sink app from the Spring Cloud Stream app starters (the RabbitMQ binding variant). The router automatically creates RabbitMQ exchanges (type: topic) named after the results of my router expressions. I think the next step is to create a new stream for each of those router results. However, the rabbit source starter app can only be configured to read from a queue. Of course, I can manually create queues and bind them to the automatically created exchanges, but is that what I'm supposed to do? Or is there some configuration I'm missing that would cause the queues to be created and bound automatically?
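
For illustration, the routing stream I mean looks something like this (the stream name, source, and expression are placeholders, not my exact definition):

stream create routeEvents --definition="http | router --expression=headers['eventType']" --deploy

Each distinct value the expression produces shows up as a topic exchange in RabbitMQ.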


2 Answers

2 votes

The Rabbit source app is intended to consume from existing infrastructure; the queue must already exist (similar to the sink).

To use Data Flow to consume from a dynamically created destination, you can use Named Destinations.

stream create fromDynDest --definition=":myRoutedDest > process1 | process2 | sink"

or even

stream create fromDynDest --definition=":myRoutedDest > sink"

1 vote

Every time you create a source app that listens to a destination, that destination is auto-created for you by default by the binder's destination provisioner. So essentially you create the consumer apps before the producer apps to ensure those destinations exist before any message is sent to them. Data Flow does that for you automatically: apps are deployed in "right-to-left" order so that consumers are started before producers.
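
For example (a sketch; the destination and stream names are placeholders), creating the consuming stream first provisions the destination, and a producing stream can then publish to it by name:

stream create consumerSide --definition=":myRoutedDest > log" --deploy
stream create producerSide --definition="http > :myRoutedDest" --deploy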

Is that what you're asking?