2
votes

I am trying to install Spark so that I can use it in my IPython notebook, following the guide here: https://beingzy.github.io/tutorial/2014/10/13/spark-tutorial-Part-I-setting-up-spark-and-ipython-notebook-within-10-minutes.html

So I downloaded Spark, the newest version 1.5.2, and ran the "build spark" part. Here is the error message I got:

C:\Users\Administrator\Downloads\spark-1.5.2\sbt>sbt assembly
[info] Set current project to sbt (in build file:/C:/Users/Administrator/Downloa
ds/spark-1.4.0/sbt/)
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: assembly
[error] assembly
[error]         ^

I came across a similar question on Stack Overflow. However, the original question is on Ubuntu and mine is on Windows. Plus, I don't have a folder called spark-0.8.1-incubating as suggested in the answer, so that solution does not work for me.

"./sbt/sbt assembly" errors "Not a valid command: assembly" for Apache Spark project

Please help!

1
It's the exact same problem as in the linked question. - Reactormonk
@Reactormonk But I don't have a folder called spark-0.8.1-incubating, and I am on Windows. So maybe it's the same sort of issue, but I don't know what I can do to solve it. Please share if you know. - Jasmine
you have one called spark-1.4.0 (you sure you got 1.5.2?) - Reactormonk
Oh, sorry. Actually I tried both; the result is the same. - Jasmine

1 Answer

0
votes

The reason is that you're executing sbt assembly inside the sbt directory, which is not Spark's build home directory (in your case, that is C:\Users\Administrator\Downloads\spark-1.5.2).

Go one directory up, i.e. to C:\Users\Administrator\Downloads\spark-1.5.2, and execute ./sbt/sbt assembly from there.
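As a sketch, the fix on a Windows command prompt would look something like this (assuming the same download location as in your error output; the exact launcher script inside the sbt\ folder may vary between Spark versions):

```shell
:: Run the build from Spark's root directory,
:: not from inside the sbt\ subdirectory.
cd C:\Users\Administrator\Downloads\spark-1.5.2
sbt\sbt assembly
```

sbt resolves the project from the directory you launch it in, which is why running it from inside sbt\ finds no Spark build definition and rejects assembly as "Not a valid command".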