
When making the call:

val kafkaParams: Map[String, String] = ...
var topic: String = ...
val input2 = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topic.toSet)

I get the error:

overloaded method value createDirectStream with alternatives:
  (jssc: org.apache.spark.streaming.api.java.JavaStreamingContext, keyClass: Class[String], valueClass: Class[String], keyDecoderClass: Class[kafka.serializer.StringDecoder], valueDecoderClass: Class[kafka.serializer.StringDecoder], kafkaParams: java.util.Map[String,String], topics: java.util.Set[String])org.apache.spark.streaming.api.java.JavaPairInputDStream[String,String]
  (ssc: org.apache.spark.streaming.StreamingContext, kafkaParams: scala.collection.immutable.Map[String,String], topics: scala.collection.immutable.Set[String])(implicit evidence$19: scala.reflect.ClassTag[String], implicit evidence$20: scala.reflect.ClassTag[String], implicit evidence$21: scala.reflect.ClassTag[kafka.serializer.StringDecoder], implicit evidence$22: scala.reflect.ClassTag[kafka.serializer.StringDecoder])org.apache.spark.streaming.dstream.InputDStream[(String, String)]
cannot be applied to (org.apache.spark.streaming.StreamingContext, scala.collection.immutable.Map[String,String], scala.collection.immutable.Set[Char])

I also get a similar error when calling the parameterised version of createStream.

Any idea what the issue is?


1 Answer


That is a long message just to say that topics needs to be a Set[String], not a Set[Char].

The best way I can see to fix this is to do:

topic.map(_.toString).toSet

But if you truly only have one topic, then just use Set(topic), since calling .toSet on a String splits it into a set of its individual characters.
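To make the distinction concrete, here is a minimal sketch of what each expression produces (the topic name "events" is just a placeholder, not from the original question):

```scala
object TopicSetDemo {
  def main(args: Array[String]): Unit = {
    val topic: String = "events"

    // A String is a sequence of Chars, so .toSet yields Set[Char],
    // which is why the compiler rejects it for the topics parameter.
    val asChars: Set[Char] = topic.toSet
    println(asChars)          // e.g. Set(e, v, n, t, s)

    // Wrapping the name in Set(...) gives the Set[String] the API expects.
    val single: Set[String] = Set(topic)
    println(single)           // Set(events)

    // For several topic names, build the Set from the names themselves.
    val many: Set[String] = Set("events", "clicks")
    println(many)
  }
}
```

The `.map(_.toString).toSet` variant from the answer also type-checks, but it produces one-character topic names, so `Set(topic)` is almost always what you actually want.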