
Hi all, I'm using SBT to build my project, and here is the structure of my project.

HiveGenerator
├── build.sbt
├── lib
├── project
│   ├── assembly.sbt
│   └── plugins.sbt
└── src
    └── main
        └── scala
            └── Main.scala

But I'm facing the error "java.lang.ClassNotFoundException: package.classname" no matter how many times I build it. I have tried `sbt clean package` and `sbt clean assembly`, but with no luck: my class is always missing from the jar.
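To illustrate what spark-submit's --class flag expects: it must be the fully qualified object name (package prefix included, if Main.scala declares one), not just "main". A minimal sketch, where the `fqcn` helper is purely illustrative and not part of my actual code:

```scala
object Main {
  // fqcn is a hypothetical helper, only here to show the name the JVM
  // registers for this object; spark-submit's --class must match it.
  // A Scala object compiles to a class named "Main$", hence the strip.
  def fqcn: String = Main.getClass.getName.stripSuffix("$")

  def main(args: Array[String]): Unit =
    println("spark-submit --class " + fqcn + " your-assembly.jar")
}
```

If Main.scala lived in a package such as `com.example`, `fqcn` would report `com.example.Main`, and that is the string --class needs.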

Here is my build.sbt

lazy val root = (project in file(".")).
  settings(
    name := "kafkaToMaprfs",
    version := "1.0",
    scalaVersion := "2.10.5",
    mainClass in Compile := Some("classname")
  )

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-hive_2.10" % "1.6.1",
  "org.apache.spark" % "spark-core_2.10" % "1.6.1",
  "org.apache.spark" % "spark-sql_2.10" % "1.6.1",
  "com.databricks" % "spark-avro_2.10" % "2.0.1",
  "org.apache.avro" % "avro" % "1.8.1",
  "org.apache.avro" % "avro-mapred" % "1.8.1",
  "org.apache.avro" % "avro-tools" % "1.8.1",
  "org.apache.spark" % "spark-streaming_2.10" % "1.6.1",
  "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.6.1",
  "org.codehaus.jackson" % "jackson-mapper-asl" % "1.9.13",
  "org.openrdf.sesame" % "sesame-rio-api" % "2.7.2",
  "log4j" % "log4j" % "1.2.17",
  "com.twitter" % "bijection-avro_2.10" % "0.7.0"
)


mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    case x => MergeStrategy.first
  }
}
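For reference, newer sbt-assembly versions spell the same strategy with the `assemblyMergeStrategy` key instead of the deprecated `<<=` operator; a sketch of the equivalent setting:

```scala
// build.sbt — same merge behavior, non-deprecated syntax
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x                             => MergeStrategy.first
}
```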

Here is my assembly.sbt

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

And here is my plugins.sbt:

addSbtPlugin("com.typesafe.sbt" % "sbt-site" % "0.7.0")
resolvers += Resolver.url("bintray-sbt-plugins", url("http://dl.bintray.com/sbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
resolvers += "OSS Sonatype" at "https://repo1.maven.org/maven2/"

However, I'm not able to build a fat jar (a jar-with-dependencies, as Maven calls it).

In Maven we have:

<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>

Which helped me to accomplish this.
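In sbt, the sbt-assembly plugin plays the role of Maven's jar-with-dependencies descriptor: `sbt clean assembly` bundles the dependencies into one jar. A sketch of an optional related setting (the jar name here is an assumption, not taken from my build):

```scala
// build.sbt — sbt-assembly 0.14.x; build with: sbt clean assembly
assemblyJarName in assembly := "kafkaToMaprfs-assembly-1.0.jar"
```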

My questions are:

1. Why am I not building a jar with all the classes in it?

2. Which commands should I use to create a jar with dependencies in sbt?

3. Do we have anything equivalent to "descriptorRefs" in sbt to do the magic?

Last question, which I didn't find an answer to: can't we achieve a proper output with sbt alone? Or should we always use spark-submit to make it happen (not considering local or cluster modes)?

Thanks in advance.

Which class is not found? With sbt assembly you can create an uber jar with dependencies, but the mergeStrategy will pick the first of any duplicates. – manuzhang

The actual Main.scala is the class I'm looking for. When I run the spark-submit command with the property --class "main" (assuming main is the class name), it throws the ClassNotFoundException, but once I decompile the jar, the class exists. – jack AKA karthik

1 Answer


Try deleting your ~/.ivy2/ directory (or moving it out of the way) and rebuild, letting everything reload from the net. Of course, you'll also have to rebuild any local builds that contribute to your assembly.

I found your post because I had the same problem, and this fixed it. It may not solve your issue, but it does solve some issues of this nature (I've seen it quite a bit).