0
votes

I have multiple jobs in the same Flink project. When I submit a job to the Flink cluster, I don't know which job is running from the submitted jar. For example:

I have 2 jobs in the same Flink project, build a jar, and deploy it to the cluster. Then I submit both jobs using the same jar.

Later I change job2, build a new jar file, and submit job2 with the newly uploaded jar.

In this case, when I look at the submit UI, I can't tell which job was submitted from which jar.

To avoid this, I could create multiple Flink projects with different jar names.

Note: I'm using a CI/CD pipeline, and I cannot set the jar name dynamically when I have multiple jobs in the same project.

What is the best practice for this?

1 Answer

0
votes

You can specify the job name when executing the flink run command by passing an argument to your main class and then using that value in env.execute(params.get("your-job-name")). This way you can reuse the same jar and pass a different job name on each submission, so you can distinguish the jobs in the Flink dashboard after deployment.
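A minimal sketch of that approach, assuming a hypothetical entry class JobRunner and a parameter named job-name (both names are placeholders, not anything from your project):

import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class JobRunner {

    public static void main(String[] args) throws Exception {
        // Parse the program arguments that follow the jar name in "flink run"
        ParameterTool params = ParameterTool.fromArgs(args);
        // Use --job-name if supplied, otherwise fall back to a default
        String jobName = params.get("job-name", "unnamed-job");

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Placeholder pipeline; replace with the real job logic
        env.fromElements(1, 2, 3).print();

        // The name passed to execute() is what shows up in the Flink dashboard
        env.execute(jobName);
    }
}

You would then submit each job under its own name, for example:

flink run -c com.example.JobRunner my-project.jar --job-name job2-v2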