
I keep getting this error message:

The message is 1169350 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.

As indicated in other StackOverflow posts, I am trying to set the “max.request.size” configuration in the Producer as follows:

.writeStream
.format("kafka")
.option(
  "kafka.bootstrap.servers",
  config.outputBootstrapServer
)
.option(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, "10000000")

But this is not working. Am I setting this correctly? Is there a different way to set this property under Spark Structured Streaming?

1 Answer


If I remember correctly, you have to prefix all Kafka properties with "kafka.". Could you try this?

.option(s"kafka.${ProducerConfig.MAX_REQUEST_SIZE_CONFIG}", "10000000")
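Putting it together, the corrected sink from the question might look like the sketch below (assumptions: `df` is the streaming DataFrame and `config.outputBootstrapServer` comes from the question's own code; `ProducerConfig` is `org.apache.kafka.clients.producer.ProducerConfig` from kafka-clients):

```scala
import org.apache.kafka.clients.producer.ProducerConfig

df.writeStream
  .format("kafka")
  .option("kafka.bootstrap.servers", config.outputBootstrapServer)
  // Spark's Kafka sink strips the "kafka." prefix and passes the remainder
  // to the underlying Kafka producer, so this sets max.request.size there.
  .option(s"kafka.${ProducerConfig.MAX_REQUEST_SIZE_CONFIG}", "10000000")
```

Without the prefix, Spark does not recognize the option as a producer property, which is why the original attempt had no effect.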