
I am trying to run a simple `sbt package` command, but it fails with the error shown below (bold command below). I would be thankful if anyone could solve my problem. My Spark version is 2.0, my Scala version is 2.11.8, and I am using JDK 1.7 on CentOS (Cloudera).

[root@hadoop first]# vim build.sbt 
    name := "First Spark"
    version := "1.0"
    organization := "in.goai"
    scalaVersion := "2.11.8"
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
    resolvers += Resolver.mavenLocal


[root@hadoop first]# ls
    build.sbt  project  src


**[root@hadoop first]# sbt package**
[info] Loading project definition from /home/training/Documents/workspace_scala/first/project
[info] Updating {file:/home/training/Documents/workspace_scala/first/project/}first-build...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
/home/training/Documents/workspace_scala/first/build.sbt:2: error: eof expected but ';' found.
version := "1.0"
^
[error] Error parsing expression.  Ensure that settings are separated by blank lines.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? q

1 Answer


Just add a blank line between each setting in your build.sbt file. Older sbt versions require settings in a .sbt file to be separated by blank lines, which is exactly what the error message is telling you (sbt 0.13.7 and later no longer need this). Hope that helps.

name := "First Spark"

version := "1.0"

organization := "in.goai"

scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"

resolvers += Resolver.mavenLocal
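
Separately, the question mentions Spark 2.0 while the dependency points at 1.6.1. If the cluster really runs Spark 2.0, the dependency line can be aligned with it. A minimal sketch, assuming the Spark 2.0.0 artifacts for Scala 2.11 (spark-core_2.11 2.0.0) match what is installed; the "provided" scope is an optional choice for apps launched with spark-submit:

name := "First Spark"

version := "1.0"

organization := "in.goai"

scalaVersion := "2.11.8"

// %% appends the Scala binary version, so this resolves to spark-core_2.11.
// "provided" keeps Spark's own jars out of your packaged artifact.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0" % "provided"

resolvers += Resolver.mavenLocal

Keeping the library version in step with the cluster's Spark version avoids binary incompatibilities at runtime.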