
I'm building an Apache Beam pipeline that reads from Kafka with KafkaIO, and I'm running into a serialization issue I don't know how to fix.

This is how KafkaIO is used:

this.pipeline
            .apply("ReadFromKafka",
                    KafkaIO
                            .<byte[], byte[]>read()
                            .withConsumerFactoryFn(input -> {
                                this.updateKafkaConsumerProperties(this.kafkaConsumerConfig, input);
                                return new KafkaConsumer<>(input);
                            })
                            .withBootstrapServers(kafkaConsumerConfig.getBootstrapServer())
                            .withTopic(this.pipelineSourceKafkaConfiguration.getOnboardingTopic())
                            .withKeyDeserializer(ByteArrayDeserializer.class)
                            .withValueDeserializer(ByteArrayDeserializer.class))

            .apply("WindowTheData", Window.into(FixedWindows.of(Duration.standardSeconds(5))))
            ...

But my driver program fails to launch, throwing the following:

java.lang.IllegalArgumentException: unable to serialize org.apache.beam.sdk.io.kafka.KafkaUnboundedSource@65bd19bf
    at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:57)
    at org.apache.beam.sdk.util.SerializableUtils.clone(SerializableUtils.java:107)
    at org.apache.beam.sdk.util.SerializableUtils.ensureSerializable(SerializableUtils.java:86)
    at org.apache.beam.sdk.io.Read$Unbounded.<init>(Read.java:137)
    at org.apache.beam.sdk.io.Read$Unbounded.<init>(Read.java:132)
    at org.apache.beam.sdk.io.Read.from(Read.java:55)
    at org.apache.beam.sdk.io.kafka.KafkaIO$Read.expand(KafkaIO.java:665)
    at org.apache.beam.sdk.io.kafka.KafkaIO$Read.expand(KafkaIO.java:277)
    at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:537)
    at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:491)
    at org.apache.beam.sdk.values.PBegin.apply(PBegin.java:56)
    at org.apache.beam.sdk.Pipeline.apply(Pipeline.java:188)
    at com.company.lib.pipelines.DataPersistencePipeline.execute(DataPersistencePipeline.java:64)
    at com.company.app.MainApp.registerPipelineEndpoints(MainApp.java:102)
    at com.company.app.MainApp.run(MainApp.java:81)
    at com.company.app.MainApp.run(MainApp.java:44)
    at io.dropwizard.cli.EnvironmentCommand.run(EnvironmentCommand.java:43)
    at io.dropwizard.cli.ConfiguredCommand.run(ConfiguredCommand.java:87)
    at io.dropwizard.cli.Cli.run(Cli.java:78)
    at io.dropwizard.Application.run(Application.java:93)
    at com.company.app.MainApp.main(MainApp.java:51)
Caused by: java.io.NotSerializableException: com.company.lib.pipelines.DataPersistencePipeline
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
    at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
    at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:53)
    ... 20 more

The exception complains that the org.apache.beam.sdk.io.kafka.KafkaUnboundedSource object is not serializable.

But that class comes from the Apache Beam SDK and does implement the Serializable interface, so I'm not sure what I'm doing wrong.

1 Answer


The KafkaIO.Read#withConsumerFactoryFn(org.apache.beam.sdk.transforms.SerializableFunction) method requires its argument to be Serializable.

The lambda you pass to it references an instance field (this.kafkaConsumerConfig) and calls an instance method (this.updateKafkaConsumerProperties), so it implicitly captures `this`. That drags the whole enclosing object (here DataPersistencePipeline) into the lambda's serialization graph, which means the enclosing class must also be Serializable.

The exception actually points at this directly: Caused by: java.io.NotSerializableException: com.company.lib.pipelines.DataPersistencePipeline

You can either make DataPersistencePipeline implement Serializable, or restructure the factory function so it no longer captures `this` (for example, copy the needed config into a local final variable before constructing the lambda, assuming the config object itself is serializable).
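Here is a minimal, Beam-free sketch of the capture problem. `SerializableFn` stands in for Beam's `SerializableFunction`, and `KafkaConfig`/`CaptureDemo` are hypothetical stand-ins for your config class and pipeline class; the `serializes` helper just attempts Java serialization:

```java
import java.io.ByteArrayOutputStream;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

class CaptureDemo {

    // Stand-in for Beam's SerializableFunction.
    interface SerializableFn<T, R> extends Serializable {
        R apply(T input);
    }

    // Serializable config, like kafkaConsumerConfig presumably is.
    static class KafkaConfig implements Serializable {
        final String bootstrapServer = "localhost:9092";
    }

    // CaptureDemo itself is NOT Serializable, like DataPersistencePipeline.
    private final KafkaConfig kafkaConsumerConfig = new KafkaConfig();

    // BROKEN: referencing the field goes through `this`, so the lambda
    // captures the whole non-serializable CaptureDemo instance.
    SerializableFn<String, String> broken() {
        return input -> input + this.kafkaConsumerConfig.bootstrapServer;
    }

    // FIXED: copy the field into a local variable first; the lambda then
    // captures only the serializable config object, not `this`.
    SerializableFn<String, String> fixed() {
        final KafkaConfig config = this.kafkaConsumerConfig;
        return input -> input + config.bootstrapServer;
    }

    // Returns true if Java serialization of `o` succeeds.
    static boolean serializes(Object o) {
        try (ObjectOutputStream out =
                     new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (NotSerializableException e) {
            return false;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        CaptureDemo demo = new CaptureDemo();
        System.out.println(serializes(demo.broken())); // false
        System.out.println(serializes(demo.fixed()));  // true
    }
}
```

Applied to your pipeline, that means copying `kafkaConsumerConfig` into a local variable inside `execute` and referencing only that local from the consumer factory lambda (and making `updateKafkaConsumerProperties` static or moving its logic into the lambda), so nothing in the closure refers back to the pipeline object.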