
I'm just getting started with Mahout and Spark, and I'm trying to run the example from Mahout's page at this link:

Playing with Mahout's Spark Shell

Everything appears to start, but when I try to run the following command, it returns the error below:

val y = drmData.collect(::, 4)

[Stage 0:> (0 + 0) / 2] 15/09/26 18:38:09 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

Can anyone help me with this?

My environment is:

export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-amd64
export MAHOUT_HOME=/home/celso/Downloads/mahout/mahout
export SPARK_HOME=/home/celso/Downloads/spark-1.4.1
export MASTER=spark://celso-VirtualBox:7077

I also tried setting MAHOUT_LOCAL to true.
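For what it's worth, that warning usually means no worker has registered with the master, or none has enough free cores and memory to run the job. One way to check from inside the shell, assuming the standard SparkContext binding sc is available as it is in a plain Spark shell:

// Lists each executor known to the driver with its total and free memory.
// If only the driver shows up here, no workers have registered and any
// job will sit at "(0 + 0) / 2" waiting for resources.
sc.getExecutorMemoryStatus.foreach { case (executor, (max, free)) =>
  println(s"$executor -> max: $max bytes, free: $free bytes")
}

The standalone master's web UI (port 8080 by default) shows the same information.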

How did you start your shell? - eliasah
I followed the tutorial at the link: first I started Spark with sbin/start-all.sh in SPARK_HOME, then from Mahout's home I ran sbin/mahout spark-shell - Celso Marigo Jr

2 Answers


The Mahout 0.11.x Spark Shell is not yet compatible with Spark 1.4.1.

The most recent release, Mahout 0.11.0, requires Spark 1.3.x.

Mahout 0.10.2 is compatible with Spark 1.2.x and earlier.
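If you are not sure which Spark version the shell actually picked up, here is a quick sanity check, again assuming the standard sc binding from the underlying Spark REPL is available:

// Prints the version of the Spark runtime the shell is connected to;
// for Mahout 0.11.0 this should report a 1.3.x release.
println(sc.version)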


I got the example to work by setting the environment variable MASTER to local:

export MASTER=local 

instead of

export MASTER=spark://hadoopvm:7077

The example worked!
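For completeness, here is a minimal local-mode session in the spirit of the tutorial. The matrix values are illustrative placeholders, and drmParallelize and dense are assumed to come from the imports the Mahout spark-shell loads by default:

// Run inside the Mahout spark-shell after setting MASTER=local.
// drmParallelize distributes an in-core matrix as a DRM; with a local
// master there is no worker registration to fail.
val drmData = drmParallelize(dense(
  (2, 2, 10.5, 10, 29.509541),
  (1, 2, 12.0, 12, 18.042851),
  (1, 1, 12.0, 13, 22.736446)), numPartitions = 2)

// Collect the fifth column (index 4) back to the driver as an in-core
// vector; this is the call that previously hung waiting for resources.
val y = drmData.collect(::, 4)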