In Kafka Streams, what's the canonical way of producing/writing a stream? In Spark there is the custom receiver, which works as a long-running adapter for an arbitrary data source. What is the equivalent in Kafka Streams?
To be specific, I'm not asking how to do transforms from one topic to another; the documentation is very clear on that. I want to understand how to write the workers that do the first write into Kafka, at the start of a series of transforms.
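Right now the only way I can see to do that first write is to run a plain KafkaProducer loop outside of the Streams DSL, roughly like the sketch below (the config values and the readFromSource() helper are just placeholders for my actual source, not real API from any intake library):

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class IntakeWorker {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // Long-running intake loop: pull from the external source and write into topic1.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            while (true) {
                String value = readFromSource(); // placeholder for the external data source
                producer.send(new ProducerRecord<>("topic1", value));
            }
        }
    }

    // Stand-in for whatever the external source's client API actually is.
    private static String readFromSource() {
        return "...";
    }
}
```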
What I was expecting, though, was to be able to do something like this:

builder1.<something>(<some intake worker, like a Spark receiver>)
  .to(topic1)
  .start()

builder2.from(topic1)
  .transform(<some transformation function>)
  .to(topic2)
  .start()
But none of the existing documentation shows anything like this. Am I missing something?
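For comparison, the topic-to-topic half is the part I do know how to write; with the current StreamsBuilder API it looks roughly like this (String serdes and an uppercase mapValues assumed just for illustration):

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

public class TransformWorker {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "transform-worker");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        StreamsBuilder builder = new StreamsBuilder();

        // Read from topic1, apply some transformation, write the result to topic2.
        builder.stream("topic1", Consumed.with(Serdes.String(), Serdes.String()))
               .mapValues(value -> value.toUpperCase()) // stand-in transformation
               .to("topic2", Produced.with(Serdes.String(), Serdes.String()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
    }
}
```

What I can't find is a Streams-native equivalent of the builder1.<something>(...) step, i.e. something that plays the role of a Spark custom receiver for the initial intake.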