
I'm trying to execute a Spark job using the following spark-submit command:

spark-submit \
  --class package.Classname \
  --queue QueueName \
  --executor-cores 2 \
  --master yarn \
  --deploy-mode cluster \
  --executor-memory 8G \
  --num-executors 20 \
  --driver-memory 10G \
  --conf "spark.yarn.executor.memoryOverhead=3G" \
  --conf "spark.speculation=true" \
  --conf "spark.network.timeout=600" \
  --conf "spark.rpc.askTimeout=600s" \
  --conf "spark.executor.heartbeatInterval=120s" \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j-spark.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j-spark.properties" \
  --conf "spark.serializer=org.apache.spark.serializer.KryoSerializer" \
  --conf "spark.root.logger=ALL,console" \
  --conf "spark.hadoop.validateOutputSpecs=false" \
  --conf "spark.driver.extraClassPath=/home/tumulusr/spark-defaults.conf" \
  --files /etc/spark2/2.6.4.0-91/0/hive-site.xml,config/ibnext.properties,config/hive.properties,config/mongo.properties,config/error.properties \
  /home/tumulusr/pn.jar

the application gets accepted but soon it exits with the following error:

ERROR root: EAP#5: Application configuration file is missing
INFO ApplicationMaster: Final app status: FAILED, exitCode: 16, (reason: Shutdown hook called before final status was reported.)
INFO ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: Shutdown hook called before final status was reported.)
INFO ApplicationMaster: Deleting staging directory (directory path with application id)
INFO ShutdownHookManager: Shutdown hook called

Am I missing anything in my spark-submit command?
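The "EAP#5: Application configuration file is missing" line is logged by the application itself, not by Spark, so one plausible cause (an assumption, since the app code isn't shown) is how the properties files are opened in cluster mode: files shipped with --files are localized into each YARN container's working directory under their bare names, so code that opens them with the client-side relative path config/hive.properties will not find them there. A minimal sketch of loading such a file by bare filename (loadProps is a hypothetical helper, not from the question's code):

```scala
import java.io.FileInputStream
import java.util.Properties

// In yarn-cluster mode, a file passed as --files config/hive.properties
// is localized into the container working directory as "hive.properties",
// so open it by bare name rather than the client-side relative path.
def loadProps(fileName: String): Properties = {
  val props = new Properties()
  val in = new FileInputStream(fileName)
  try props.load(in) finally in.close()
  props
}
```

If the files must keep their directory prefix, the --files syntax also allows renaming via `local-path#name-in-container`.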


1 Answer


Can you try the following before your spark-submit command?

export SPARK_MAJOR_VERSION=2
export HADOOP_CONF_DIR=**/hadoop/conf/path**
spark-submit ...
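Put together with the question's command, that would look like the sketch below. The HADOOP_CONF_DIR value here is an assumed HDP-style default, not something from the question; substitute the actual path to your cluster's Hadoop client configs.

```shell
# Select Spark 2 on HDP and point at the Hadoop client configs so YARN
# can resolve the cluster and localize the --files resources correctly.
export SPARK_MAJOR_VERSION=2
export HADOOP_CONF_DIR=/etc/hadoop/conf   # assumed default; adjust for your cluster

spark-submit \
  --class package.Classname \
  --queue QueueName \
  --master yarn \
  --deploy-mode cluster \
  ...   # remaining --conf and --files options exactly as in the question
  /home/tumulusr/pn.jar
```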