1 vote

I have a Spark streaming job that receives messages from RabbitMQ and performs some state mapping on them.
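For context, a job like the one described might look roughly like the sketch below (it uses mapWithState; the RabbitMQ connector is replaced by a placeholder socket source, and the checkpoint path and mapping logic are assumptions, not the actual code):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, State, StateSpec, StreamingContext}

object StreamingJobSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("rabbitmq-state-mapping")
    val ssc = new StreamingContext(conf, Seconds(5))
    ssc.checkpoint("/tmp/checkpoint") // checkpointing is required for stateful ops

    // Placeholder input: the real job would use a RabbitMQ receiver here.
    val messages = ssc.socketTextStream("localhost", 9999)

    // Hypothetical state mapping: keep a running count per message key.
    val mappingFunc = (key: String, value: Option[Int], state: State[Long]) => {
      val newCount = state.getOption.getOrElse(0L) + value.getOrElse(0)
      state.update(newCount)
      (key, newCount)
    }

    messages
      .map(m => (m, 1))
      .mapWithState(StateSpec.function(mappingFunc))
      .print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```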

At some point in my Spark streaming job (about 2,000 messages / 500 batches in), the processing just freezes. It continues receiving messages and creating batches, but those just stay pending.

The executor memory is set to 2G and the driver memory to 1G.

I couldn't see anything in the logs, and the executors page showed nothing (before and after).

Why is this happening?

1
Looks like the current task is not finishing. You will need to trace the execution to figure it out. - maasg
Yeah, there were actually 2 jobs running when it happened. I guess one of them is just the streaming itself and the other one could be the one that's frozen. I'll need to wait a bit to reproduce it and check. - Jochen Niebuhr

1 Answer

-1 votes

Seems like it was some kind of OOM kill. After putting explicit memory limits on the JVM, it seems to work.
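If the limits were applied through Spark's standard memory settings rather than raw JVM flags, it would look roughly like the sketch below; the property values, the overhead setting, and the extra GC/heap-dump options are assumptions, not what was actually changed:

```scala
import org.apache.spark.SparkConf

// Sketch of constraining executor memory so the OS/YARN does not OOM-kill it.
val conf = new SparkConf()
  .setAppName("rabbitmq-state-mapping")
  .set("spark.executor.memory", "2g")           // executor JVM heap
  .set("spark.executor.memoryOverhead", "512m") // off-heap headroom
                                                // (spark.yarn.executor.memoryOverhead on older releases, in MB)
  .set("spark.executor.extraJavaOptions",
       "-XX:+UseG1GC -XX:+HeapDumpOnOutOfMemoryError") // make OOM failures visible in logs

// Driver memory normally has to be set before the driver JVM starts,
// e.g. spark-submit --driver-memory 1g, rather than in SparkConf.
```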