
I want to update the Spark properties of a currently running Spark Streaming job.

I have set some properties on SparkConf in the program and some in spark-defaults.conf.

How do I update them so that my currently running job picks them up? Is that possible?


1 Answer


It is usually not possible to refresh configuration at runtime. Only certain SQL options (`spark.sql.*`) can be changed on an active session, via `spark.conf.set`.

Otherwise:

  • modify the configuration
  • restart the application
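For the restart route, the new values can go into `spark-defaults.conf` (or be passed with `--conf` at submit time) before resubmitting the job; a sketch with illustrative property values:

```
# spark-defaults.conf — edit, then stop and resubmit the streaming job
spark.executor.memory                        4g
spark.streaming.backpressure.enabled         true
# lets a streaming job finish in-flight batches on shutdown
spark.streaming.stopGracefullyOnShutdown     true
```

Enabling graceful shutdown matters here because a streaming job restarted mid-batch may otherwise lose or reprocess data.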