
I'm building the Apache Spark source code on Ubuntu 14.04.4 (Spark version 1.6.1, Scala code runner version 2.10.4) with the command

sudo sbt/sbt assembly

and I'm getting the following error:

[warn] def deleteRecursively(dir: TachyonFile, client: TachyonFS) {
[warn]     ^
[error]
[error]      while compiling: /home/ashish/spark-apps/spark-1.6.1/core/src/main/scala/org/apache/spark/util/random/package.scala
[error]         during phase: jvm
[error]      library version: version 2.10.5
[error]     compiler version: version 2.10.5
[error]   reconstructed args: -deprecation -Xplugin:/home/ashish/.ivy2/cache/org.spark-project/genjavadoc-plugin_2.10.5/jars/genjavadoc-plugin_2.10.5-0.9-spark0.jar -feature -P:genjavadoc:out=/home/ashish/spark-apps/spark-1.6.1/core/target/java -classpath /home/ashish/spark-apps/spark-1.6.1/core/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/launcher/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/network/common/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/network/shuffle/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/unsafe/target/scala-2.10/classes:/home/ashish/.ivy2/cache/org.spark-project.spark/unused/jars/unused-1.0.0.jar:/home/ashish/.ivy2/cache/com.google.guava/guava/bundles/guava-14.0.1.jar:/home/ashish/.ivy2/cache/io.netty/netty-all/jars/netty-all-4.0.29.Final.jar:/home/ashish/.ivy2/cache/org.fusesource.leveldbjni/leveldbjni-all/bundles/leveldbjni-all-1.8.jar:/home/ashish/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.4.4.jar:/home/ashish/.ivy2/cache/com.fasterxml.jackson.core/jackson-annotations/bundles/jackson-annotations-2.4.4.jar:/home/ashish/.ivy2/cache/com.fasterxml.jackson.core/jackson-core/bundles/jackson-......and many other jars...
[error]
[error]   last tree to typer: Literal(Constant(collection.mutable.Map))
[error]               symbol: null
[error]    symbol definition: null
[error]                  tpe: Class(classOf[scala.collection.mutable.Map])
[error]        symbol owners:
[error]       context owners: package package -> package random
[error]
[error] == Enclosing template or block ==
[error]
[error] Template( // val <local package>: <notype> in package random, tree.tpe=org.apache.spark.util.random.package.type
[error]   "java.lang.Object" // parents
[error]   ValDef(
[error]     private
[error]     "_"
[error]     <tpt>
[error]     <empty>
[error]   )
[error]   DefDef( // def <init>(): org.apache.spark.util.random.package.type in package random
[error]     <method>
[error]     "<init>"
[error]     []
[error]     List(Nil)
[error]     <tpt> // tree.tpe=org.apache.spark.util.random.package.type
[error]     Block( // tree.tpe=Unit
[error]       Apply( // def <init>(): Object in class Object, tree.tpe=Object
[error]         package.super."<init>" // def <init>(): Object in class Object, tree.tpe=()Object
[error]         Nil
[error]       )
[error]       ()
[error]     )
[error]   )
[error] )
[error]
[error] == Expanded type of tree ==
[error]
[error] ConstantType(value = Constant(collection.mutable.Map))
[error]
[error] uncaught exception during compilation: java.io.IOException
[error] File name too long
[warn] 45 warnings found
[error] two errors found
[error] (core/compile:compile) Compilation failed
[error] Total time: 5598 s, completed 5 Apr, 2016 9:06:50 AM



Where am I going wrong?


2 Answers


You should build Spark with Maven instead of sbt...

download the source and run build/mvn -DskipTests clean package
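
A minimal end-to-end session might look like the following. It assumes a Spark 1.6.1 source download from the Apache archive (the URL and version are assumptions, not taken from the question); the MAVEN_OPTS values are the ones recommended in Spark's "Building Spark" documentation for 1.6.x:

# fetch and unpack the source release (adjust version/mirror as needed)
wget https://archive.apache.org/dist/spark/spark-1.6.1/spark-1.6.1.tgz
tar -xzf spark-1.6.1.tgz
cd spark-1.6.1

# give Maven enough memory, per Spark's build documentation
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"

# build/mvn downloads a compatible Maven automatically if none is installed;
# -DskipTests avoids running the test suites during packaging
build/mvn -DskipTests clean package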