I'm using the 0.9 Kafka Java client in Scala.
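For reference, props is nothing special — just a minimal producer config along these lines (the exact values shouldn't matter for the question):

import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

val props = new Properties()
props.put("bootstrap.servers", "localhost:9092") // local broker
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer") // keys are plain strings
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer") // values too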
scala> val kafkaProducer = new KafkaProducer[String, String](props)
ProducerRecord has several constructors that let you include or omit a key and/or a partition.
scala> val keyedRecord = new ProducerRecord("topic", "key", "value")
scala> kafkaProducer.send(keyedRecord)
This works with no problem, since both type parameters are inferred as String.
However, an unkeyed ProducerRecord gives a type error.
scala> new ProducerRecord("topic", "value")
res8: org.apache.kafka.clients.producer.ProducerRecord[Nothing,String] =
ProducerRecord(topic=topic, partition=null, key=null, value=value)
scala> kafkaProducer.send(res8)
<console>:17: error: type mismatch;
found : org.apache.kafka.clients.producer.ProducerRecord[Nothing,String]
required: org.apache.kafka.clients.producer.ProducerRecord[String,String]
Note: Nothing <: String, but Java-defined class ProducerRecord is invariant in type K.
You may wish to investigate a wildcard type such as `_ <: String`. (SLS 3.2.10)
kafkaProducer.send(res8)
^
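If I give the type parameters explicitly, the same unkeyed record goes through, which makes me suspect this is Scala inferring Nothing for the missing key type rather than anything Kafka itself requires (though I may be missing something):

// Explicit type parameters sidestep the inference of Nothing for the key type.
val unkeyedRecord = new ProducerRecord[String, String]("topic", "value")
kafkaProducer.send(unkeyedRecord) // compiles fine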
Is this restriction part of Kafka's rules, or is it just an unnecessary precaution that comes from using this Java API in Scala?
More fundamentally, is it poor form to put keyed and unkeyed messages in the same Kafka topic?
Thank you
Javadoc: http://kafka.apache.org/090/javadoc/org/apache/kafka/clients/producer/package-summary.html
Edit
Could changing the variance of parameter K in KafkaProducer fix this?
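To illustrate what I mean (a toy Scala analogue, not the actual Kafka classes): if the record type were covariant in its key parameter, the Nothing-keyed record would simply be a subtype of the String-keyed one and the call would type-check:

// Toy stand-ins for ProducerRecord/KafkaProducer, covariant in the key type K.
case class MyRecord[+K, +V](topic: String, key: Option[K], value: V)

def send(record: MyRecord[String, String]): Unit =
  println(s"would send $record")

// Key omitted: K is inferred as Nothing, but MyRecord[Nothing, String] <: MyRecord[String, String].
send(MyRecord("topic", None, "value"))

I realise the real ProducerRecord is Java-defined, and Java generics have no declaration-site variance, which is why I'm asking whether there is a better way around this.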