I have a Spark master running on Amazon EC2. I tried to connect to it with PySpark from another EC2 instance, as follows:
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("MyApp") \
    .master("spark_url_as_obtained_in_web_ui") \
    .getOrCreate()
I got the following errors:
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2018-04-04 20:03:04 WARN Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
............
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
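As far as I can tell, the message is asking me to set spark.driver.bindAddress (and presumably spark.driver.host as well) explicitly on the driver side. For concreteness, this is the kind of configuration I mean; the master URL and private IP below are placeholders, not my actual values:

from pyspark.sql import SparkSession

# Placeholders: substitute the real master URL from the web UI and the
# private IP of the EC2 instance the driver runs on (e.g. from `hostname -i`).
MASTER_URL = "spark://<master-private-ip>:7077"
DRIVER_PRIVATE_IP = "<driver-private-ip>"

spark = (
    SparkSession.builder
    .appName("MyApp")
    .master(MASTER_URL)
    # local address that driver services such as 'sparkDriver' bind to
    .config("spark.driver.bindAddress", DRIVER_PRIVATE_IP)
    # address the driver advertises to the master and executors
    .config("spark.driver.host", DRIVER_PRIVATE_IP)
    .getOrCreate()
)

(My understanding is that spark.driver.bindAddress controls what the driver binds to locally, while spark.driver.host is what it advertises to the cluster.)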
I tried all the solutions described in the following posts, but to no avail:
All masters are unresponsive ! ? Spark master is not responding with datastax architecture
spark submit "Service 'Driver' could not bind on port" error
https://community.hortonworks.com/questions/8257/how-can-i-resolve-it.html
What could be going wrong??