1 vote

I am a newbie trying to profile a local spark job. Here is the command that I am trying to execute, but I am getting a warning stating my executor options are being ignored since they are non-spark config properties.

error:

Warning: Ignoring non-spark config property: "spark.executor.extraJavaOptions=javaagent:statsd-jvm-profiler-2.1.0-jar-with-dependencies.jar=server=localhost,port=8086,reporter=InfluxDBReporter,database=profiler,username=profiler,password=profiler,prefix=MyNamespace.MySparkApplication,tagMapping=namespace.application"

Command:

./bin/spark-submit --master local[2] --class org.apache.spark.examples.GroupByTest --conf "spark.executor.extraJavaOptions=-javaagent:statsd-jvm-profiler-2.1.0-jar-with-dependencies.jar=server=localhost,port=8086,reporter=InfluxDBReporter,database=profiler,username=profiler,password=profiler,prefix=MyNamespace.MySparkApplication,tagMapping=namespace.application" --name HdfsWordCount --jars /Users/shprin/statD/statsd-jvm-profiler-2.1.0-jar-with-dependencies.jar libexec/examples/jars/spark-examples_2.11-2.3.0.jar

Spark version: 2.0.3

Please let me know how to solve this.

Thanks in advance.

Comments:

Can you please remove the quotes around spark.executor.extraJavaOptions=... and start over? I'd rather do --conf spark.executor.extraJavaOptions="-javaagent:statsd-jvm-profiler-2.1.0-jar-with-dependencies.jar=server=localhost,port=8086,reporter=InfluxDBReporter,database=profiler,username=profiler,password=profiler,prefix=MyNamespace.MySparkApplication,tagMapping=namespace.application" or even without the quotes. – Jacek Laskowski

Tried that, didn't work. – LearningNinja

What was the error? – Jacek Laskowski

2 Answers

1 vote

Apart from the answers above, if your parameter contains both spaces and single quotes (for instance a query parameter), you should enclose it in escaped double quotes (\").

Example:

spark-submit --master yarn --deploy-mode cluster --conf "spark.driver.extraJavaOptions=-DfileFormat=PARQUET -Dquery=\"select * from bucket where code in ('A')\" -Dchunk=yes" spark-app.jar
0 votes

I think the problem is the double quotes you are using to specify spark.executor.extraJavaOptions. They should have been single quotes.

./bin/spark-submit --master local[2] --conf 'spark.executor.extraJavaOptions=-javaagent:statsd-jvm-profiler-2.1.0-jar-with-dependencies.jar=server=localhost,port=8086,reporter=InfluxDBReporter,database=profiler,username=profiler,password=profiler,prefix=MyNamespace.MySparkApplication,tagMapping=namespace.application' --class org.apache.spark.examples.GroupByTest --name HdfsWordCount --jars /Users/shprin/statD/statsd-jvm-profiler-2.1.0-jar-with-dependencies.jar libexec/examples/jars/spark-examples_2.11-2.3.0.jar
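
If shell quoting keeps causing trouble, another option is to set the property programmatically instead of on the command line, so no quotes are involved at all. Below is a minimal sketch using Spark's standard SparkConf/SparkSession API; the agent jar name and InfluxDB settings are copied from the question and are assumptions about your environment, and the agent jar still has to be shipped to executors (e.g. via --jars).

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Sketch: configure the executor JVM agent in code, avoiding shell quoting entirely.
object ProfiledApp {
  def main(args: Array[String]): Unit = {
    // Agent options taken verbatim from the question; adjust host/port/credentials as needed.
    val agentOpts =
      "-javaagent:statsd-jvm-profiler-2.1.0-jar-with-dependencies.jar=" +
        "server=localhost,port=8086,reporter=InfluxDBReporter,database=profiler," +
        "username=profiler,password=profiler," +
        "prefix=MyNamespace.MySparkApplication,tagMapping=namespace.application"

    val conf = new SparkConf()
      .setAppName("HdfsWordCount")
      .set("spark.executor.extraJavaOptions", agentOpts)

    val spark = SparkSession.builder().config(conf).getOrCreate()
    // ... job logic goes here ...
    spark.stop()
  }
}
```

Note that this only affects executor JVMs that are launched after the application starts (YARN, standalone, etc.). With --master local[2], as in the question, tasks run inside the driver JVM, so the agent would need to be attached via spark.driver.extraJavaOptions instead.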