I have been able to set up Spring Cloud Data Flow 1.2.0.RELEASE in a Kubernetes cluster, import the starter apps, and run simple flows such as "http | log".
But when I try to run my own Dockerized stream app in a stream like "http | myApp | log", I can't seem to get it to work. Deployment is fine, the Kafka topics for the stream are created as expected, and there are no error messages from any of the pods, but the myApp processor doesn't seem to be aware of the Kafka topics and therefore never receives data from the http source.
My question is this: is there anything special about Dockerizing a streaming app for SCDF? How does the Data Flow server configure a K8s pod to point it to the correct Kafka topics? I went to https://github.com/spring-cloud-stream-app-starters and can't find any Dockerfile examples, so I don't know how it's done for the starter apps.
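For what it's worth, the app is packaged as a standard Spring Boot fat jar, and my assumption is that a minimal Dockerfile along these lines should be enough (the base image and jar name below are placeholders, not what the starter apps actually use):

FROM openjdk:8-jdk-alpine
VOLUME /tmp
# Copy the Spring Boot fat jar produced by the build (name is illustrative)
COPY target/myapp-processor-1.0.0.jar /app.jar
# Start the app; I assume SCDF supplies the Kafka binding properties via the container environment
ENTRYPOINT ["java", "-jar", "/app.jar"]

If the starter app images do anything beyond this, that may well be what I'm missing.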
My stream application is pretty straightforward; the code looks like this:
import java.util.List;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Processor;
import org.springframework.messaging.handler.annotation.SendTo;

@SpringBootApplication
@EnableBinding(Processor.class)
public class MyAppProcessor {

    @StreamListener(Processor.INPUT)
    @SendTo(Processor.OUTPUT)
    public List<MyPOJO> doProcessing(List<Double> doubles) {
        // ... transformation logic ...
    }

    public static void main(String[] args) {
        SpringApplication.run(MyAppProcessor.class, args);
    }
}
I also tried @ServiceActivator instead of @StreamListener, but it doesn't seem to make a difference.
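For completeness, the @ServiceActivator variant I tried was along these lines (a simplified sketch; the class name is just illustrative and the handler body is omitted as above):

import java.util.List;
import org.springframework.cloud.stream.messaging.Processor;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.stereotype.Component;

@Component
public class MyAppHandler {

    // Consume from the processor's input channel and send the result to its output channel
    @ServiceActivator(inputChannel = Processor.INPUT, outputChannel = Processor.OUTPUT)
    public List<MyPOJO> doProcessing(List<Double> doubles) {
        // ... same transformation logic as above ...
    }
}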