I'm just getting started with Mahout and Spark, and I'm trying to run the example from Mahout's page at this link:
Playing with Mahout's Spark Shell
Everything appears to start fine, but when I try to run the following command, it returns the error below:
val y = drmData.collect(::, 4)
[Stage 0:> (0 + 0) / 2] 15/09/26 18:38:09 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
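For context, drmData is created earlier in the tutorial with drmParallelize over a small dense matrix, roughly like this (a sketch with placeholder rows, not the tutorial's exact cereal data; inside the Mahout spark-shell the needed imports and the distributed context are already in scope):

// Sketch: how the tutorial builds drmData (placeholder values);
// numPartitions = 2 matches the "(0 + 0) / 2" shown in the stalled stage
val drmData = drmParallelize(dense(
  (2, 2, 10.5, 10, 29.509541),
  (1, 2, 12, 12, 18.042851),
  (1, 1, 12, 13, 22.736446)),
  numPartitions = 2)

// The failing call collects column 4 (the target values) to the driver
val y = drmData.collect(::, 4)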
Can anyone help me with this?
My environment is:
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-amd64
export MAHOUT_HOME=/home/celso/Downloads/mahout/mahout
export SPARK_HOME=/home/celso/Downloads/spark-1.4.1
export MASTER=spark://celso-VirtualBox:7077
I also tried setting MAHOUT_LOCAL to true; see the quick checks below.
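In case it helps narrow this down, here are two checks I can run from inside the shell (a sketch; it assumes the Spark context is exposed as sc the way a stock spark-shell does, whereas Mahout's shell may only expose it through the distributed context sdc):

// Confirm which master the shell actually connected to:
// expecting spark://celso-VirtualBox:7077, or a local[...] master when MAHOUT_LOCAL=true
println(sc.master)

// Count registered block managers; a size of 1 means only the driver is present,
// i.e. no workers have registered with the master (which is what the warning suggests)
println(sc.getExecutorMemoryStatus.size)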