I am trying to compile and run a WordCount program in Scala from the command line, without Maven or sbt. The command I am using to compile the Scala program is:
scalac -classpath /spark-2.3.0-bin-hadoop2.7/jars/ Wordcount.scala
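From what I have read about Java classpaths, a directory entry only supplies .class files, not the jars inside it, so the jars may need to be listed individually. This is the variant I would try next (just a sketch, assuming a bash shell and no spaces in the jar paths; the glob is joined with ':'):

scalac -classpath "$(echo /spark-2.3.0-bin-hadoop2.7/jars/*.jar | tr ' ' ':')" Wordcount.scala

Here is Wordcount.scala: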
import org.apache.spark._
import org.apache.spark.SparkConf

/** Create an RDD of lines from a text file, and keep count of
  * how often each word appears.
  */
object wordcount {
  def main(args: Array[String]) {
    // Set up a SparkContext named WordCount that runs locally using
    // all available cores.
    val conf = new SparkConf().setAppName("WordCount")
    conf.setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Split each line of the input file (path passed as the first
    // argument) into words and count how often each word appears.
    val lines = sc.textFile(args(0))
    val counts = lines.flatMap(_.split("\\s+")).countByValue()
    counts.foreach(println)

    sc.stop()
  }
}
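Once it compiles, my plan for running it is similar: put the same jars plus the current directory (for the generated classes) on the classpath. A sketch, again assuming bash; input.txt is just a placeholder path, and since Spark 2.3.0 ships the Scala 2.11 runtime in the same jars directory, plain java should be able to launch the object's main:

java -cp "$(echo /spark-2.3.0-bin-hadoop2.7/jars/*.jar | tr ' ' ':'):." wordcount input.txt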
MY RESEARCH:
I have checked the Spark source code and found that the classes referenced by the import statements do exist in the Spark distribution.
For example, SparkConf is defined in the package org.apache.spark, which is exactly what the program imports:
https://github.com/apache/spark/blob/v2.3.1/core/src/main/scala/org/apache/spark/SparkConf.scala
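To double-check this on the binary side, listing the contents of the core jar should show the compiled class (a sketch; I am assuming the core jar in that directory is named spark-core_2.11-2.3.0.jar):

unzip -l /spark-2.3.0-bin-hadoop2.7/jars/spark-core_2.11-2.3.0.jar | grep org/apache/spark/SparkConf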
ERRORS I AM FACING:
Wordcount.scala:3: error: object apache is not a member of package org
import org.apache.spark._
           ^
Wordcount.scala:4: error: object apache is not a member of package org
import org.apache.spark.SparkConf
           ^
Wordcount.scala:14: error: not found: type SparkConf
    val conf = new SparkConf().setAppName("WordCount")
                   ^
Wordcount.scala:16: error: not found: type SparkContext
    val sc = new SparkContext(conf)
                 ^
four errors found