I'm using the Beam Java SDK 2.9.0. My job reads from Kafka in its first step. It works just fine on the Direct runner, but when I deploy it to Dataflow the job gets stuck and I see no progress. The Dataflow monitoring UI shows:
Output collections
    EventKafkaReadTransform/Values/Values/Map.out0
        Elements added: –
        Estimated size: –
The Stackdriver logs seem to be stuck in a loop with the message below:
Error syncing pod 75bf4f18ce7d4d30a2b7de627656b517 ("dataflow-eventingestjob-xxx-0-02062225-wxsc-harness-r3kq_default(75bf4f18ce7d4d30a2b7de627656b517)"), skipping: failed to "StartContainer" for "java-streaming" with CrashLoopBackOff: "Back-off 5m0s restarting failed container=java-streaming pod=dataflow-eventingestjob-xxx-0-02062225-wxsc-harness-r3kq_default(75bf4f18ce7d4d30a2b7de627656b517)
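For context, the Kafka read step looks roughly like this. This is a minimal sketch, not my exact code: the bootstrap servers, topic name, and deserializers here are placeholders.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Values;
import org.apache.beam.sdk.values.PCollection;
import org.apache.kafka.common.serialization.StringDeserializer;

public class EventIngestJob {
  public static void main(String[] args) {
    // The runner (DirectRunner vs. DataflowRunner) is selected via --runner on the command line.
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().create();
    Pipeline pipeline = Pipeline.create(options);

    // Kafka read step; broker, topic, and deserializers below are placeholders.
    PCollection<String> events = pipeline
        .apply("EventKafkaReadTransform",
            KafkaIO.<String, String>read()
                .withBootstrapServers("broker:9092")              // placeholder
                .withTopic("events")                              // placeholder
                .withKeyDeserializer(StringDeserializer.class)
                .withValueDeserializer(StringDeserializer.class)
                .withoutMetadata())                               // yields KV<String, String>
        .apply(Values.create());                                  // keep only the record values

    // Downstream processing omitted here.
    pipeline.run();
  }
}
```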
I can't figure out what else to look for. Any help is appreciated.