I'm working with Zeppelin (0.7.1) on Spark (2.1.1) on my localhost, and trying to add some configuration values to the jobs I run.
Specifically, I'm trying to set the es.nodes value for elasticsearch-hadoop.
I tried adding the key and value to the interpreter configuration, but it didn't show up in sc.getConf. Adding "--conf mykey:myvalue" to the interpreter's "args" setting didn't register either. Isn't that what the Spark interpreter configuration is supposed to do?
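Concretely, the two attempts looked roughly like this (the host value is a placeholder, and the second line mirrors the args format I tried, not a format I know to be correct):

```
# Spark interpreter property in Zeppelin's interpreter settings
# (did not show up in sc.getConf):
es.nodes        localhost:9200

# Value added to the interpreter's "args" setting
# (did not register either):
--conf es.nodes:localhost:9200
```

In both cases I verified with `sc.getConf.getAll` in a notebook paragraph and the key was absent.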
sc.setConf? elastic.co/guide/en/elasticsearch/hadoop/current/… – 1ambda