I am trying to read a Kafka topic with the new directStream method in KafkaUtils. The topic has 8 partitions. I am running the streaming job on YARN with 8 executors, 1 core each (`--num-executors 8 --executor-cores 1`). I noticed that Spark reads all of the topic's partitions on a single executor, sequentially; this is obviously not what I want. I want Spark to read all partitions in parallel. How can I achieve that?
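For reference, the stream is created roughly like this (broker address and topic name below are placeholders, not my real values):

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object DirectStreamJob {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("direct-stream-test")
    val ssc = new StreamingContext(conf, Seconds(5))

    // Direct approach: no receivers; each Kafka partition should map
    // to one Spark partition in the resulting RDDs.
    val kafkaParams = Map("metadata.broker.list" -> "broker1:9092")
    val topics = Set("my-topic") // topic with 8 partitions

    val stream = KafkaUtils.createDirectStream[
      String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topics)

    stream.foreachRDD { rdd =>
      // I expected 8 partitions here, processed on 8 executors in parallel
      println(s"partitions in batch: ${rdd.getNumPartitions}")
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```

This matches my understanding that the direct stream creates one RDD partition per Kafka partition, which is why I expected the 8 executors to each consume one partition.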
Thank you in advance.