I have one job that will take a long time to run on Dataproc. In the meantime, I need to be able to run other, smaller jobs.
From what I could gather from the Google Dataproc documentation, the platform is supposed to support multiple concurrent jobs, since it uses YARN dynamic allocation for resources.
However, when I try to launch multiple jobs, they get queued, and a new one doesn't start until the cluster is free.
All settings are at their defaults. How can I enable multiple jobs to run at the same time?
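For context, this is roughly how I submit the jobs (the cluster name, region, bucket, and job files below are placeholders, not my real ones):

```shell
# Long-running Spark job (hypothetical jar, class, and cluster name)
gcloud dataproc jobs submit spark \
  --cluster=my-cluster --region=us-central1 \
  --class=com.example.LongJob \
  --jars=gs://my-bucket/long-job.jar &

# Smaller PySpark job submitted while the first is still running;
# this is the one that sits queued in YARN until the cluster frees up
gcloud dataproc jobs submit pyspark gs://my-bucket/small_job.py \
  --cluster=my-cluster --region=us-central1
```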