
I am submitting a Spark job on an HDInsight Spark cluster. While submitting the job, it throws the following error:

17/05/01 13:55:14 WARN Client: Failed to connect to server: hn0-testsp.a0yxittmcfkubfqkfg1ld1vobc.bx.internal.cloudapp.net/172.18.0.28:8050: retries get failed due to exceeded maximum allowed retries number: 0 java.net.ConnectException: Connection refused

Here 172.18.0.28 is the IP address of the Spark head node.

Following is the command with its input arguments:

spark-submit --class com.org.stream.spark.CustomKafkaStreamWriter \
    --master yarn \
    --deploy-mode cluster \
    --driver-memory 4g \
    --executor-memory 2g \
    --executor-cores 1 \
    --queue thequeue \
    target/sampleproject-SNAPSHOT.jar \
    172.18.0.39:2181 172.18.0.35:9092

Can anyone help me figure out what the issue might be?

Are you submitting this job on the head node? – Thomas Nys

1 Answer


As far as I'm aware, you can't submit Spark jobs to the cluster directly from an external machine.

Instead, either SSH into the head node and run spark-submit there, or, probably the better option since it lets you submit jobs independently of the cluster, submit the job through Livy.
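
For the Livy route, here is a minimal sketch of a batch submission. It assumes the jar has already been uploaded to the cluster's default storage (the wasb:/// path and the <clustername> placeholder are assumptions; replace them with your own values), and it authenticates with the cluster's admin login:

    # submit the same job as a Livy batch (POST /batches) against the HDInsight Livy endpoint
    curl -u admin -H "Content-Type: application/json" \
        -X POST \
        -d '{
              "file": "wasb:///jars/sampleproject-SNAPSHOT.jar",
              "className": "com.org.stream.spark.CustomKafkaStreamWriter",
              "args": ["172.18.0.39:2181", "172.18.0.35:9092"],
              "driverMemory": "4g",
              "executorMemory": "2g",
              "executorCores": 1,
              "queue": "thequeue"
            }' \
        https://<clustername>.azurehdinsight.net/livy/batches

The response contains a batch id you can poll at https://<clustername>.azurehdinsight.net/livy/batches/{batchId} to check the job's state. If you'd rather stick with spark-submit, SSH into the head node first (for example ssh sshuser@<clustername>-ssh.azurehdinsight.net, assuming the default sshuser account) and run your command from there.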