1
votes

I'm trying to install standalone Apache Spark on Ubuntu, and while running the "sbt/sbt assembly" command I get this error:

java.lang.RuntimeException: Could not create directory /opt/spark-1.5.1/external/zeromq/target/streams/compile/$global/$global/discoveredMainClasses
        at scala.sys.package$.error(package.scala:27)
        at sbt.IO$.createDirectory(IO.scala:166)
        at sbt.IO$.touch(IO.scala:142)
        at sbt.std.Streams$$anon$3$$anon$2.make(Streams.scala:129)
        at sbt.std.Streams$$anon$3$$anon$2.binary(Streams.scala:116)
        at sbt.SessionVar$$anonfun$persist$1.apply(SessionVar.scala:27)
        at sbt.SessionVar$$anonfun$persist$1.apply(SessionVar.scala:26)
        at sbt.std.Streams$class.use(Streams.scala:75)
        at sbt.std.Streams$$anon$3.use(Streams.scala:100)
        at sbt.SessionVar$.persist(SessionVar.scala:26)
        at sbt.SessionVar$.persistAndSet(SessionVar.scala:21)
        at sbt.Project$RichTaskSessionVar$$anonfun$storeAs$1$$anonfun$apply$5.apply(Project.scala:556)
        at sbt.Project$RichTaskSessionVar$$anonfun$storeAs$1$$anonfun$apply$5.apply(Project.scala:556)
        at sbt.SessionVar$$anonfun$1$$anonfun$apply$1.apply(SessionVar.scala:40)
        at sbt.SessionVar$$anonfun$1$$anonfun$apply$1.apply(SessionVar.scala:40)
        at scala.Function$$anonfun$chain$1$$anonfun$apply$1.apply(Function.scala:24)
        at scala.Function$$anonfun$chain$1$$anonfun$apply$1.apply(Function.scala:24)
        at scala.collection.IndexedSeqOptimized$class.foldl(IndexedSeqOptimized.scala:51)
        at scala.collection.IndexedSeqOptimized$class.foldLeft(IndexedSeqOptimized.scala:60)
        at scala.collection.mutable.ArrayBuffer.foldLeft(ArrayBuffer.scala:47)
        at scala.collection.TraversableOnce$class.$div$colon(TraversableOnce.scala:138)
        at scala.collection.AbstractTraversable.$div$colon(Traversable.scala:105)
        at scala.Function$$anonfun$chain$1.apply(Function.scala:24)
        at sbt.EvaluateTask$.applyResults(EvaluateTask.scala:370)
        at sbt.EvaluateTask$.liftedTree1$1(EvaluateTask.scala:344)
        at sbt.EvaluateTask$.run$1(EvaluateTask.scala:341)
        at sbt.EvaluateTask$.runTask(EvaluateTask.scala:361)
        at sbt.Aggregation$$anonfun$3.apply(Aggregation.scala:64)
        at sbt.Aggregation$$anonfun$3.apply(Aggregation.scala:62)
        at sbt.EvaluateTask$.withStreams(EvaluateTask.scala:293)
        at sbt.Aggregation$.timedRun(Aggregation.scala:62)
        at sbt.Aggregation$.runTasks(Aggregation.scala:71)
        at sbt.Aggregation$$anonfun$applyTasks$1.apply(Aggregation.scala:32)
        at sbt.Aggregation$$anonfun$applyTasks$1.apply(Aggregation.scala:31)
        at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:60)
        at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:60)
        at sbt.Aggregation$$anonfun$evaluatingParser$4$$anonfun$apply$5.apply(Aggregation.scala:153)
        at sbt.Aggregation$$anonfun$evaluatingParser$4$$anonfun$apply$5.apply(Aggregation.scala:152)
        at sbt.Act$$anonfun$sbt$Act$$actParser0$1$$anonfun$sbt$Act$$anonfun$$evaluate$1$1$$anonfun$apply$10.apply(Act.scala:244)
        at sbt.Act$$anonfun$sbt$Act$$actParser0$1$$anonfun$sbt$Act$$anonfun$$evaluate$1$1$$anonfun$apply$10.apply(Act.scala:241)
        at sbt.Command$.process(Command.scala:92)
        at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:98)
        at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:98)
        at sbt.State$$anon$1.process(State.scala:184)
        at sbt.MainLoop$$anonfun$1.apply(MainLoop.scala:98)
        at sbt.MainLoop$$anonfun$1.apply(MainLoop.scala:98)
        at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
        at sbt.MainLoop$.next(MainLoop.scala:98)
        at sbt.MainLoop$.run(MainLoop.scala:91)
        at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:70)
        at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:65)
        at sbt.Using.apply(Using.scala:24)
        at sbt.MainLoop$.runWithNewLog(MainLoop.scala:65)
        at sbt.MainLoop$.runAndClearLast(MainLoop.scala:48)
        at sbt.MainLoop$.runLoggedLoop(MainLoop.scala:32)
        at sbt.MainLoop$.runLogged(MainLoop.scala:24)
        at sbt.StandardMain$.runManaged(Main.scala:53)
        at sbt.xMain.run(Main.scala:28)
        at xsbt.boot.Launch$$anonfun$run$1.apply(Launch.scala:109)
        at xsbt.boot.Launch$.withContextLoader(Launch.scala:128)
        at xsbt.boot.Launch$.run(Launch.scala:109)
        at xsbt.boot.Launch$$anonfun$apply$1.apply(Launch.scala:35)
        at xsbt.boot.Launch$.launch(Launch.scala:117)
        at xsbt.boot.Launch$.apply(Launch.scala:18)
        at xsbt.boot.Boot$.runImpl(Boot.scala:41)
        at xsbt.boot.Boot$.main(Boot.scala:17)
        at xsbt.boot.Boot.main(Boot.scala)
[error] Could not create directory /opt/spark-1.5.1/external/zeromq/target/streams/compile/$global/$global/discoveredMainClasses
[error] Use 'last' for the full log.

Has anyone else faced this problem?

java version "1.8.0_65"

Scala code runner version 2.11.7 -- Copyright 2002-2013, LAMP/EPFL

3
Typically /opt belongs to the root user. Most likely the user you're using here doesn't have sufficient privileges to write there. - zero323
Forgot to mention that "sbt/sbt assembly" was executed as the root user: root@server:/opt/spark-1.5.1# sbt/sbt assembly - Tomas

3 Answers

2
votes

As the error mentions, you do not have write access to the /opt directory.

Could not create directory /opt/spark-1.5.1/external/zeromq/target/streams/compile/$global/$global/discoveredMainClasses

You need root access in order to write to this directory. You can either:

  • Download and compile Apache Spark in your home folder and then move it to /opt (a sketch of this follows below), or
  • Run sudo sbt/sbt assembly to gain root access while building Spark (though compiling as root is generally considered unsafe).
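For the first option, a rough sketch (the paths are illustrative and assume the Spark 1.5.1 sources were unpacked into your home directory):

    # Build as your own user, in a directory you can write to...
    cd ~/spark-1.5.1
    sbt/sbt assembly

    # ...then move the finished tree into /opt with root privileges.
    sudo mv ~/spark-1.5.1 /opt/spark-1.5.1
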
0
votes

You must have root privileges to add and manipulate files under /opt/; your Spark configuration is wrong. I recommend following these steps to install Spark and Scala, then try running sbt again. All the best: https://www.youtube.com/watch?v=BozSL9ygUto
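A hedged alternative (not what this answer suggests, just a common way to avoid building as root): hand ownership of the Spark tree under /opt to your normal user, assuming /opt/spark-1.5.1 is only used for this build:

    # Give your login user ownership of the Spark tree so sbt can write to the target/ directories
    sudo chown -R "$USER" /opt/spark-1.5.1
    sbt/sbt assembly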

0
votes

We get this error intermittently. It may be caused by an internal SBT bug.

"It seems like there is a race condition in SBT which is triggered only by plugins that cause multiple compilation processes to run in parallel."

See here for more info: https://github.com/sbt/sbt/issues/1673

See if you can disable some plugins and re-run the build.
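If disabling plugins is not practical, one more thing that may be worth trying (an assumption on my part, not something stated in the linked issue) is turning off sbt's parallel task execution for that run, since the race is reportedly triggered by parallel compilation:

    # Pass a one-off setting to sbt that serializes task execution, then build
    sbt/sbt "set parallelExecution in Global := false" assembly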