You know how in Apache Storm you can have a Spout stream data to multiple Bolts? Is there a way to do something similar in Apache Spark?
I basically want one program that reads data from a Kafka queue and outputs it to 2 different programs, each of which can then process it in its own way.
Specifically, there would be a reader program that reads data from the Kafka queue and outputs it to 2 programs, x and y. x would calculate one kind of metric (in my case, user activity), whereas y would calculate another kind (in my case, activity broken down by device).
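To make it concrete, here is a rough sketch of the fan-out I am imagining, written against Spark Streaming's Kafka 0.10 direct-stream API (the broker address, topic name, and the two branch bodies are placeholders I made up; each branch just counts records per batch):

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

object KafkaFanOut {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KafkaFanOut")
    val ssc  = new StreamingContext(conf, Seconds(10))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers"  -> "localhost:9092",            // placeholder broker
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "fan-out-example"            // placeholder group id
    )

    // The single "reader": one direct stream from the Kafka topic
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](Seq("events"), kafkaParams))

    val values = stream.map(_.value)

    // Branch "x": user-activity metrics (placeholder: count records per batch)
    values.foreachRDD { rdd =>
      println(s"user-activity branch saw ${rdd.count()} records")
    }

    // Branch "y": per-device metrics (placeholder: same data, different branch)
    values.foreachRDD { rdd =>
      println(s"device branch saw ${rdd.count()} records")
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```

I am not sure whether Spark lets the two consumers be entirely separate applications, as in my description, so the sketch keeps them as two branches attached to the same stream within one job.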
Can someone help me understand how this is possible in Spark?