I have just started using Apache Spark. I tested it on my local computer (Windows 10, Intel Core i5, 8 GB RAM) and everything worked correctly. When I tried to start a cluster manually, I got an error, shown in the attached image below:
Info from the log (C:\Spark\logs\spark--org.apache.spark.deploy.master.Master-1-XXXXXX.out):
Spark Command: C:\Program Files\Java\jdk1.8.0_72\bin\java -cp C:\Spark/conf\; C:\Spark/lib/spark-assembly-1.6.0-hadoop2.6.0.jar;C:\Spark\lib\datanucleus-api-jdo-3.2.6.jar; C:\Spark\lib\datanucleus-core-3.2.10.jar;C:\Spark\lib\datanucleus-rdbms-3.2.9.jar -Xms1g -Xmx1g org.apache.spark.deploy.master.Master --ip XXXXXX --port 7077 --webui-port 8080
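For context, this is how I launched the master manually; a minimal sketch on Windows, assuming C:\Spark is the install root as in the log above (XXXXXX stands for the host name/IP, elided as in the log):

```shell
REM Launch the standalone master manually from the Windows command prompt.
REM Assumes SPARK_HOME is C:\Spark, matching the log path above.
cd C:\Spark
bin\spark-class.cmd org.apache.spark.deploy.master.Master --ip XXXXXX --port 7077 --webui-port 8080
```

If the master starts, its web UI should be reachable on port 8080 of that host.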
I used these sources to try to resolve the issue, but with no success:
Spark Standalone Mode
How to Setup Local Standalone Spark Node
Setup a Apache Spark cluster in your single standalone machine
Thank you for any feedback.